Mar 01 09:07:49 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 01 09:07:49 crc restorecon[4643]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 01 09:07:49 crc restorecon[4643]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc 
restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc 
restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 
09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc 
restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc 
restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:49 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 
crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 
09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 01 09:07:50 crc restorecon[4643]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 
crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc 
restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 01 09:07:50 crc restorecon[4643]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 01 09:07:51 crc kubenswrapper[4792]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.171124 4792 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174120 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174140 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174145 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174149 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174154 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174158 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174162 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174167 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174171 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174176 4792 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174181 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174187 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174191 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174196 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174201 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174207 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174216 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174220 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174224 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174228 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174234 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174240 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174245 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174250 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174254 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174258 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174263 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174266 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174270 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174274 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174279 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174283 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174289 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174293 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174297 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174301 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174305 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174310 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174314 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174318 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174322 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174326 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174330 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174335 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174340 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174345 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174349 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174354 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174358 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174362 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174365 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174369 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174373 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174376 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174380 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174383 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174387 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174390 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 
09:07:51.174394 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174397 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174401 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174405 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174408 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174411 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174415 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174418 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174421 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174425 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174428 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174432 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.174435 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175208 4792 flags.go:64] FLAG: --address="0.0.0.0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175222 4792 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 
09:07:51.175233 4792 flags.go:64] FLAG: --anonymous-auth="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175243 4792 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175249 4792 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175253 4792 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175259 4792 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175264 4792 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175268 4792 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175275 4792 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175280 4792 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175284 4792 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175289 4792 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175293 4792 flags.go:64] FLAG: --cgroup-root="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175297 4792 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175301 4792 flags.go:64] FLAG: --client-ca-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175305 4792 flags.go:64] FLAG: --cloud-config="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175309 4792 flags.go:64] FLAG: --cloud-provider="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175313 4792 flags.go:64] FLAG: 
--cluster-dns="[]" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175319 4792 flags.go:64] FLAG: --cluster-domain="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175323 4792 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175327 4792 flags.go:64] FLAG: --config-dir="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175331 4792 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175336 4792 flags.go:64] FLAG: --container-log-max-files="5" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175342 4792 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175364 4792 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175369 4792 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175375 4792 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175380 4792 flags.go:64] FLAG: --contention-profiling="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175384 4792 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175388 4792 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175392 4792 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175396 4792 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175402 4792 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175406 4792 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 01 09:07:51 crc 
kubenswrapper[4792]: I0301 09:07:51.175412 4792 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175416 4792 flags.go:64] FLAG: --enable-load-reader="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175420 4792 flags.go:64] FLAG: --enable-server="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175425 4792 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175430 4792 flags.go:64] FLAG: --event-burst="100" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175435 4792 flags.go:64] FLAG: --event-qps="50" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175439 4792 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175443 4792 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175447 4792 flags.go:64] FLAG: --eviction-hard="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175453 4792 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175457 4792 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175461 4792 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175465 4792 flags.go:64] FLAG: --eviction-soft="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175469 4792 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175473 4792 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175478 4792 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175482 4792 flags.go:64] FLAG: --experimental-mounter-path="" Mar 
01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175486 4792 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175490 4792 flags.go:64] FLAG: --fail-swap-on="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175494 4792 flags.go:64] FLAG: --feature-gates="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175499 4792 flags.go:64] FLAG: --file-check-frequency="20s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175503 4792 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175507 4792 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175512 4792 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175517 4792 flags.go:64] FLAG: --healthz-port="10248" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175521 4792 flags.go:64] FLAG: --help="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175525 4792 flags.go:64] FLAG: --hostname-override="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175529 4792 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175533 4792 flags.go:64] FLAG: --http-check-frequency="20s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175538 4792 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175542 4792 flags.go:64] FLAG: --image-credential-provider-config="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175546 4792 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175550 4792 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175554 4792 flags.go:64] FLAG: --image-service-endpoint="" Mar 01 09:07:51 crc 
kubenswrapper[4792]: I0301 09:07:51.175558 4792 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175563 4792 flags.go:64] FLAG: --kube-api-burst="100" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175568 4792 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175573 4792 flags.go:64] FLAG: --kube-api-qps="50" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175578 4792 flags.go:64] FLAG: --kube-reserved="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175585 4792 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175590 4792 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175595 4792 flags.go:64] FLAG: --kubelet-cgroups="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175600 4792 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175605 4792 flags.go:64] FLAG: --lock-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175610 4792 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175615 4792 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175620 4792 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175628 4792 flags.go:64] FLAG: --log-json-split-stream="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175632 4792 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175637 4792 flags.go:64] FLAG: --log-text-split-stream="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175643 4792 flags.go:64] FLAG: --logging-format="text" Mar 01 09:07:51 crc 
kubenswrapper[4792]: I0301 09:07:51.175648 4792 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175653 4792 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175659 4792 flags.go:64] FLAG: --manifest-url="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175664 4792 flags.go:64] FLAG: --manifest-url-header="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175671 4792 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175676 4792 flags.go:64] FLAG: --max-open-files="1000000" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175681 4792 flags.go:64] FLAG: --max-pods="110" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175685 4792 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175689 4792 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175693 4792 flags.go:64] FLAG: --memory-manager-policy="None" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175697 4792 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175703 4792 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175707 4792 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175712 4792 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175723 4792 flags.go:64] FLAG: --node-status-max-images="50" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175728 4792 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 01 09:07:51 
crc kubenswrapper[4792]: I0301 09:07:51.175732 4792 flags.go:64] FLAG: --oom-score-adj="-999" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175736 4792 flags.go:64] FLAG: --pod-cidr="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175740 4792 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175747 4792 flags.go:64] FLAG: --pod-manifest-path="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175752 4792 flags.go:64] FLAG: --pod-max-pids="-1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175756 4792 flags.go:64] FLAG: --pods-per-core="0" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175760 4792 flags.go:64] FLAG: --port="10250" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175764 4792 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175769 4792 flags.go:64] FLAG: --provider-id="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175773 4792 flags.go:64] FLAG: --qos-reserved="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175778 4792 flags.go:64] FLAG: --read-only-port="10255" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175782 4792 flags.go:64] FLAG: --register-node="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175786 4792 flags.go:64] FLAG: --register-schedulable="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175790 4792 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175798 4792 flags.go:64] FLAG: --registry-burst="10" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175802 4792 flags.go:64] FLAG: --registry-qps="5" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175806 4792 flags.go:64] FLAG: 
--reserved-cpus="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175810 4792 flags.go:64] FLAG: --reserved-memory="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175815 4792 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175819 4792 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175823 4792 flags.go:64] FLAG: --rotate-certificates="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175827 4792 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175831 4792 flags.go:64] FLAG: --runonce="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175835 4792 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175839 4792 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175844 4792 flags.go:64] FLAG: --seccomp-default="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175848 4792 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175853 4792 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175858 4792 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175863 4792 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175869 4792 flags.go:64] FLAG: --storage-driver-password="root" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175874 4792 flags.go:64] FLAG: --storage-driver-secure="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175894 4792 flags.go:64] FLAG: --storage-driver-table="stats" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175899 4792 
flags.go:64] FLAG: --storage-driver-user="root" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175918 4792 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175924 4792 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175929 4792 flags.go:64] FLAG: --system-cgroups="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175933 4792 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175940 4792 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175944 4792 flags.go:64] FLAG: --tls-cert-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175948 4792 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175954 4792 flags.go:64] FLAG: --tls-min-version="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175958 4792 flags.go:64] FLAG: --tls-private-key-file="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175962 4792 flags.go:64] FLAG: --topology-manager-policy="none" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175967 4792 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175971 4792 flags.go:64] FLAG: --topology-manager-scope="container" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175976 4792 flags.go:64] FLAG: --v="2" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175982 4792 flags.go:64] FLAG: --version="false" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175987 4792 flags.go:64] FLAG: --vmodule="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175993 4792 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.175997 4792 
flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176104 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176110 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176114 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176118 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176122 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176126 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176131 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176135 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176138 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176142 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176147 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176152 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176156 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176159 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176163 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176167 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176172 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176176 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176181 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176185 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176188 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176192 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176197 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176201 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176206 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176210 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176214 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176219 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176223 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176228 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176232 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176236 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176240 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176244 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176248 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176252 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176255 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176259 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176263 
4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176267 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176270 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176274 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176277 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176281 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176284 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176288 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176292 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176298 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176302 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176306 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176310 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176313 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176317 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176321 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176328 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176332 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176337 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176340 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176345 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176349 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176352 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176356 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176359 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176363 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176366 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176370 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176374 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176378 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176381 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176388 4792 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.176392 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.176398 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.187096 4792 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.187636 4792 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187788 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187803 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187811 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187819 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187828 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187836 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187842 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc 
kubenswrapper[4792]: W0301 09:07:51.187849 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187856 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187863 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187870 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187878 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187884 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187892 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187899 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187928 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187937 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187944 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187950 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187957 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187967 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187974 4792 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187981 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187988 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.187995 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188003 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188010 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188017 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188025 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188031 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188039 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188047 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188054 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188062 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188068 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188078 4792 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188089 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188097 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188106 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188113 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188121 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188128 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188136 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188143 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188151 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188161 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188169 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188176 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188185 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188192 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188198 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188205 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188212 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188221 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188229 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188236 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188246 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188253 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188259 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188266 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188274 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188281 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188288 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188295 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188302 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188310 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188316 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188323 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188330 4792 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188336 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188345 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.188357 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188570 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188585 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188593 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188602 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188611 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188619 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188626 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188633 4792 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188640 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188647 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188654 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188661 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188668 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188675 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188681 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188689 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188696 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188703 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188709 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188716 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188725 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188732 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 01 09:07:51 crc 
kubenswrapper[4792]: W0301 09:07:51.188738 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188745 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188752 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188760 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188768 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188777 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188785 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188792 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188800 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188807 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188813 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188820 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188827 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188834 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 01 09:07:51 crc 
kubenswrapper[4792]: W0301 09:07:51.188841 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188848 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188854 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188861 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188868 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188875 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188882 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188889 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188898 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188931 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188941 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188950 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188957 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188963 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188970 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188977 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188985 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188991 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.188998 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189005 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189014 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189021 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189028 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189034 4792 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189041 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189050 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189059 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189066 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189073 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189081 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189087 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189095 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189104 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189112 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.189119 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.189132 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.190282 4792 server.go:940] "Client rotation is on, will bootstrap in background" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.196114 4792 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.202605 4792 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.202824 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204800 4792 server.go:997] "Starting client certificate rotation" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204854 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.204997 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.234859 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.237303 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.239748 4792 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.256578 4792 log.go:25] "Validated CRI v1 runtime API" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.292888 4792 log.go:25] "Validated CRI v1 image API" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.295814 4792 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.301485 4792 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-01-09-01-56-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.301524 4792 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313382 4792 manager.go:217] Machine: {Timestamp:2026-03-01 09:07:51.312058072 +0000 UTC m=+0.553937269 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199472640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7013d830-7d29-4a03-853d-b832509642d4 BootID:10ee72b7-c3f1-449a-bf55-34c8d2b9c7af Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076107 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599734272 Type:vfs Inodes:3076107 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039894528 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:05:dc:1a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:05:dc:1a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:84:8f:7c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a1:4b:44 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:71:73:f9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:68:72:88 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e8:05:b1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:ea:05:c0:de:9e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:66:81:6d:65:90:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199472640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} 
{Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313568 4792 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.313784 4792 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314075 4792 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314268 4792 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.314306 4792 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315289 4792 topology_manager.go:138] "Creating topology manager with none policy" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315310 4792 container_manager_linux.go:303] "Creating device plugin manager" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315834 4792 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.315859 4792 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.316727 4792 state_mem.go:36] "Initialized new in-memory state store" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.317133 4792 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320856 4792 kubelet.go:418] "Attempting to sync node with API server" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320894 4792 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320939 4792 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320954 4792 kubelet.go:324] "Adding apiserver pod source" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.320968 4792 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.325743 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.325860 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.326469 4792 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.326621 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.328049 4792 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.330072 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.332673 4792 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.339880 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340184 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340199 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340208 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340223 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340237 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340248 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340264 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340277 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340288 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340303 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.340312 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.341600 4792 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.343647 4792 server.go:1280] "Started kubelet" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.343975 4792 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.344694 4792 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.345527 4792 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.346212 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:51 crc systemd[1]: Started Kubernetes Kubelet. Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.348983 4792 server.go:460] "Adding debug handlers to kubelet server" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350182 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350219 4792 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.350827 4792 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351269 4792 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351450 4792 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.351603 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351664 4792 factory.go:55] Registering systemd factory Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.351721 4792 factory.go:221] Registration of the systemd container factory successfully Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351661 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351720 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.351920 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.353888 4792 factory.go:153] Registering CRI-O factory Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.353945 4792 factory.go:221] Registration of the crio container factory successfully Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354052 4792 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354085 4792 factory.go:103] Registering Raw factory Mar 01 09:07:51 crc 
kubenswrapper[4792]: I0301 09:07:51.354102 4792 manager.go:1196] Started watching for new ooms in manager Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.354920 4792 manager.go:319] Starting recovery of all containers Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364174 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364239 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364253 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364264 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364289 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364300 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364314 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364422 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364439 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364453 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364462 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364479 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364490 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364568 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.364579 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368785 4792 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368873 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.368896 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.370943 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.370986 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371001 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371019 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.361882 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371033 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371101 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371118 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371149 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371164 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 
09:07:51.371184 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371198 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371214 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371225 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371305 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371317 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371344 4792 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371365 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371375 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371407 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371417 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371459 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371470 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371490 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371507 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371522 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371534 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371543 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371553 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371595 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371627 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371673 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371686 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371699 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371716 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371743 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371791 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371882 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371946 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371963 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371980 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.371993 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.372034 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373355 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373420 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373434 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373446 4792 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373458 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373469 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373479 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373492 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373508 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373522 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373562 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373575 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373596 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373656 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373672 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373686 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373706 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373720 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373737 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373750 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373766 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373781 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373793 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373805 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373818 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373832 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373845 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373867 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373888 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373903 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373938 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373955 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373975 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.373989 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374001 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374017 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374030 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374062 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374075 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374089 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374107 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374124 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374152 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374169 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374208 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374223 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374239 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374258 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374271 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374287 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374301 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374318 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374342 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374360 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374376 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374390 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374405 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374418 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374431 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374484 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374506 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374524 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374563 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374576 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374590 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374605 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" 
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374618 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374633 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374645 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374680 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374696 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374710 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374755 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374772 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374785 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374800 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374813 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374829 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374846 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374859 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374875 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374889 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374932 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374947 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374961 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374977 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.374992 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375006 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375021 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375035 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375048 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375065 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375079 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375101 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375120 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375143 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375160 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 
09:07:51.375198 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375214 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375228 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375250 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375268 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375283 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375304 4792 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375320 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375334 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375347 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375362 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375376 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375389 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375402 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375426 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375445 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375465 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375487 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375501 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375547 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375561 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375577 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375591 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375605 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375618 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375633 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375648 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375661 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375675 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375687 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375702 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375720 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375734 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375748 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375762 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375776 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" 
seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375799 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375815 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375832 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375852 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375866 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375885 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375898 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375926 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375958 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375970 4792 reconstruct.go:97] "Volume reconstruction finished" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.375980 4792 reconciler.go:26] "Reconciler: start to sync state" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.380861 4792 manager.go:324] Recovery completed Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.391733 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393300 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.393384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394139 4792 
cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394159 4792 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.394178 4792 state_mem.go:36] "Initialized new in-memory state store" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.405593 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407449 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407495 4792 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.407525 4792 kubelet.go:2335] "Starting kubelet main sync loop" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.407570 4792 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.408665 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.408741 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.428007 4792 policy_none.go:49] "None policy: Start" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.428968 4792 
memory_manager.go:170] "Starting memorymanager" policy="None" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.429041 4792 state_mem.go:35] "Initializing new in-memory state store" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.452116 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483023 4792 manager.go:334] "Starting Device Plugin manager" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483109 4792 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483126 4792 server.go:79] "Starting device plugin registration server" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483675 4792 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483693 4792 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.483890 4792 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.484015 4792 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.484022 4792 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.495927 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.507747 4792 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.507883 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509072 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509260 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509469 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.509570 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510568 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510742 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510891 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.510983 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.511133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512313 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.512723 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc 
kubenswrapper[4792]: I0301 09:07:51.513206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.513344 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514232 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514543 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514662 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.514700 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.515434 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518576 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518599 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.518999 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.519033 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520513 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.520526 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.552901 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578798 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578815 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.578960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579068 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579125 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.579315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.583802 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585271 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.585307 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.585718 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 
09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681051 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681086 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680982 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.680899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681387 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 
09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681399 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.681695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.786662 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.788491 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.789076 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" 
node="crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.863073 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.873733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.894063 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.922603 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.927437 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998 WatchSource:0}: Error finding container 54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998: Status 404 returned error can't find the container with id 54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998 Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.931899 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e WatchSource:0}: Error finding container 7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e: Status 404 returned error can't find the container with id 7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e Mar 01 09:07:51 crc kubenswrapper[4792]: I0301 09:07:51.932532 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.937514 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158 WatchSource:0}: Error finding container 25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158: Status 404 returned error can't find the container with id 25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158 Mar 01 09:07:51 crc kubenswrapper[4792]: E0301 09:07:51.953755 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.958024 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7 WatchSource:0}: Error finding container a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7: Status 404 returned error can't find the container with id a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7 Mar 01 09:07:51 crc kubenswrapper[4792]: W0301 09:07:51.966077 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376 WatchSource:0}: Error finding container 9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376: Status 404 returned error can't find the container with id 
9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376 Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.195136 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.202242 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.202791 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.342324 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.342421 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.347428 4792 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.411890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7f6e6a0bb71eeadc54be97424d9bc94a4f1fde2ff23b9ce9c18316fe1533597e"} Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.412849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"54de6d95ee1a53bd81473feb5d58e04aa62779a5735069ffba57159a05e89998"} Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.413764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9995264936e1cf89cc354c4ddff95af9290d403e2ce9aca130316680133c4376"} Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.416831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6dd73c06f97cd741b2cff04b9b032c55e67d970b5d44ad6a4fa216321f3aff7"} Mar 01 09:07:52 crc kubenswrapper[4792]: I0301 09:07:52.417967 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25ab059ec0587552073d7fa61eb3c0bd4797b13cbcca56593dab2ff0fc7bb158"} Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.418796 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.418874 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.688597 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.688677 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: W0301 09:07:52.734812 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.734934 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" 
logger="UnhandledError" Mar 01 09:07:52 crc kubenswrapper[4792]: E0301 09:07:52.755217 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.003961 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006299 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.006363 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.007038 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.336943 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.337771 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.347246 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438274 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.438536 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439849 4792 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.439894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441389 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441525 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.441552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.443832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444182 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="89989fe138b858dc3592cf753044ffb1142921d214e92ea0cf407ba0a44790c0" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444603 4792 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.444601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"89989fe138b858dc3592cf753044ffb1142921d214e92ea0cf407ba0a44790c0"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.452891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.453332 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.454982 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455037 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.455785 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6faf5a3e146bbcf0a0ebd280fd8a537de025e388c0b5abd07452fed024a59d58" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.456025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6faf5a3e146bbcf0a0ebd280fd8a537de025e388c0b5abd07452fed024a59d58"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.456073 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457974 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.457989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458418 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f" exitCode=0 Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458453 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f"} Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.458558 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.459851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.460097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:53 crc kubenswrapper[4792]: I0301 09:07:53.460230 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:53 crc kubenswrapper[4792]: E0301 09:07:53.468737 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.158234 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.158310 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.347283 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 
09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.356473 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464097 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"60fcc4fed210cb9af0325e0689663cbd1c6099e535847c37fe925b2262c1ba2d"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464153 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.464988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.465005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69"} Mar 01 
09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.470445 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471870 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.471880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.475303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 
09:07:54.475314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477407 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fa59a983918262456d3ae53d3e2465ee0d2d5d20f175a152775ccf8cf961d4ac" exitCode=0 Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477489 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fa59a983918262456d3ae53d3e2465ee0d2d5d20f175a152775ccf8cf961d4ac"} Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.477577 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478872 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478915 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.478968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.607332 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608557 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:54 crc kubenswrapper[4792]: I0301 09:07:54.608636 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.609178 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.89:6443: connect: connection refused" node="crc" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.646990 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.647069 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: 
connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.880208 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.880277 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:54 crc kubenswrapper[4792]: W0301 09:07:54.913484 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.89:6443: connect: connection refused Mar 01 09:07:54 crc kubenswrapper[4792]: E0301 09:07:54.913578 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.89:6443: connect: connection refused" logger="UnhandledError" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.482648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb"} Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.482750 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc 
kubenswrapper[4792]: I0301 09:07:55.483793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.483829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.483843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484259 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dbb402d58001b334d03322f8dbd5127ef54e6f9f4c68a50f96678cfb0d18b0c1" exitCode=0 Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484333 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dbb402d58001b334d03322f8dbd5127ef54e6f9f4c68a50f96678cfb0d18b0c1"} Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484385 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484407 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484431 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.484433 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485308 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485337 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485354 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.485413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.486163 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.498783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.498978 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 01 09:07:55 crc kubenswrapper[4792]: I0301 09:07:55.499991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2fc17e8ed51e49812ecf9e98e9643efca15ff1ae0c651ce9ce595cab4b867835"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"261e0c49c66e0e2c75ff92754f75e47f491bd9bb111c21090588dc541a5bea7e"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cde437ad36fd39f054e2cd44aaa14c04910c7b50c7fa9bd449fbc9473a90556e"} Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489708 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.489798 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490349 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490968 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.490968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.491058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:56 crc kubenswrapper[4792]: I0301 09:07:56.491069 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"85b1d3f760090b536cbc905310656d527e974d4ebd5fc8624a6fce696ac2f62a"} Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"93314bca863dfc57ee7c0badd8485c0d785e94cac5c02f9299588aa1704961e4"} Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.498947 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.499034 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500274 4792 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500414 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500449 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.500469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.722953 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.809985 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811654 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:57 crc kubenswrapper[4792]: I0301 09:07:57.811696 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.408581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.499553 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.499667 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.501959 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:58 crc kubenswrapper[4792]: I0301 09:07:58.503121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.221603 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.221894 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.223831 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.348191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.348490 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.350759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.504145 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505537 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:07:59 crc kubenswrapper[4792]: I0301 09:07:59.505599 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.115104 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.115620 4792 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.120511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.123058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.235981 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.236544 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.238431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.507742 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.509446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:00 crc 
kubenswrapper[4792]: I0301 09:08:00.509494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:00 crc kubenswrapper[4792]: I0301 09:08:00.509512 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.005860 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:01 crc kubenswrapper[4792]: E0301 09:08:01.496260 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.509835 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:01 crc kubenswrapper[4792]: I0301 09:08:01.511859 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.906468 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.906754 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908159 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908192 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:03 crc kubenswrapper[4792]: I0301 09:08:03.908205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.348027 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.524663 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527217 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" exitCode=255 Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527257 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb"} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.527396 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.528497 4792 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.529967 4792 scope.go:117] "RemoveContainer" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.846969 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.847100 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.849197 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.849885 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.849974 4792 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.852175 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.852298 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.854817 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.856212 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure 
output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.856293 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.856626 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: W0301 09:08:05.859284 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.859352 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 01 09:08:05 crc kubenswrapper[4792]: E0301 09:08:05.865331 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.866494 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:05 crc kubenswrapper[4792]: I0301 09:08:05.866601 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.349832 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:06Z is after 2026-02-23T05:33:13Z Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.533784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.536235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"} Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.536537 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.537892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.537977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:06 crc kubenswrapper[4792]: I0301 09:08:06.538000 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.350149 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:07Z is after 2026-02-23T05:33:13Z Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.543218 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.544149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.547982 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" exitCode=255 Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd"} Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548194 4792 scope.go:117] "RemoveContainer" containerID="27e10bb6a0f100f119755de151dfea900c1268526a3fe0ee0f4f067654d2c3fb" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.548447 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550692 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.550721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:07 crc kubenswrapper[4792]: I0301 09:08:07.553366 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:07 
crc kubenswrapper[4792]: E0301 09:08:07.553892 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.350274 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:08Z is after 2026-02-23T05:33:13Z Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.500143 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.500313 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:08 crc kubenswrapper[4792]: I0301 09:08:08.553121 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 01 09:08:09 crc kubenswrapper[4792]: 
I0301 09:08:09.350371 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:09Z is after 2026-02-23T05:33:13Z Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.244341 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.244568 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.245973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.246664 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:10 crc kubenswrapper[4792]: E0301 09:08:10.246940 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.249741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.350804 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:10Z is after 2026-02-23T05:33:13Z Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561000 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.561836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:10 crc kubenswrapper[4792]: I0301 09:08:10.562343 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:10 crc kubenswrapper[4792]: E0301 09:08:10.562525 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.012657 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 
09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.012836 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014066 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.014187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.018105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.350079 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:11Z is after 2026-02-23T05:33:13Z Mar 01 09:08:11 crc kubenswrapper[4792]: E0301 09:08:11.496835 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.563821 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.565179 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.565263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:11 crc 
kubenswrapper[4792]: I0301 09:08:11.565315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:11 crc kubenswrapper[4792]: I0301 09:08:11.566305 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:11 crc kubenswrapper[4792]: E0301 09:08:11.566642 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.202572 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.251122 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253080 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.253138 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.257493 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.259598 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.349556 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:12Z is after 2026-02-23T05:33:13Z Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.566387 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567255 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.567267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:12 crc kubenswrapper[4792]: I0301 09:08:12.568077 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:12 crc kubenswrapper[4792]: E0301 09:08:12.568744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.350137 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:13Z is after 2026-02-23T05:33:13Z Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.931659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.931971 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933688 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.933784 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:13 crc kubenswrapper[4792]: I0301 09:08:13.943854 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.080715 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:08:14 crc kubenswrapper[4792]: E0301 09:08:14.086011 4792 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:14 crc kubenswrapper[4792]: W0301 09:08:14.088736 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z Mar 01 09:08:14 crc kubenswrapper[4792]: E0301 09:08:14.088817 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.349634 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:14Z is after 2026-02-23T05:33:13Z Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.572023 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573465 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:14 crc kubenswrapper[4792]: I0301 09:08:14.573553 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:15 crc kubenswrapper[4792]: I0301 09:08:15.352143 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z Mar 01 09:08:15 crc kubenswrapper[4792]: W0301 09:08:15.357127 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z Mar 01 09:08:15 crc kubenswrapper[4792]: E0301 09:08:15.357243 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:15 crc kubenswrapper[4792]: E0301 09:08:15.871691 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-01T09:08:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:16 crc kubenswrapper[4792]: I0301 09:08:16.352427 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 2026-02-23T05:33:13Z Mar 01 09:08:16 crc kubenswrapper[4792]: W0301 09:08:16.740083 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 2026-02-23T05:33:13Z Mar 01 09:08:16 crc kubenswrapper[4792]: E0301 09:08:16.740209 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:16Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:17 crc kubenswrapper[4792]: I0301 09:08:17.352081 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:17Z is after 2026-02-23T05:33:13Z Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.351307 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.499940 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500032 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500128 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.500339 4792 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.501992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.502642 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 01 09:08:18 crc kubenswrapper[4792]: I0301 09:08:18.502821 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" gracePeriod=30 Mar 01 09:08:18 crc kubenswrapper[4792]: W0301 09:08:18.544162 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z Mar 01 09:08:18 crc kubenswrapper[4792]: E0301 09:08:18.544253 4792 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.258080 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.259993 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:19 crc kubenswrapper[4792]: E0301 09:08:19.264944 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 01 09:08:19 crc kubenswrapper[4792]: E0301 09:08:19.265095 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 
09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.353062 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:19Z is after 2026-02-23T05:33:13Z Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.589746 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590648 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" exitCode=255 Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2"} Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"} Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.590939 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591946 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:19 crc kubenswrapper[4792]: I0301 09:08:19.591964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:20 crc kubenswrapper[4792]: I0301 09:08:20.352343 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:20Z is after 2026-02-23T05:33:13Z Mar 01 09:08:21 crc kubenswrapper[4792]: I0301 09:08:21.354376 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:21Z is after 2026-02-23T05:33:13Z Mar 01 09:08:21 crc kubenswrapper[4792]: E0301 09:08:21.496945 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:22 crc kubenswrapper[4792]: I0301 09:08:22.350610 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:22Z is after 2026-02-23T05:33:13Z Mar 01 09:08:23 crc kubenswrapper[4792]: I0301 09:08:23.352696 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:23Z 
is after 2026-02-23T05:33:13Z Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.350103 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:24Z is after 2026-02-23T05:33:13Z Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.408198 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.409532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:24 crc kubenswrapper[4792]: I0301 09:08:24.410516 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.348962 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:25Z is after 2026-02-23T05:33:13Z Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.498701 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.498841 4792 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.500409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.612249 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.613082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615575 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" exitCode=255 Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9"} Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615670 4792 scope.go:117] "RemoveContainer" containerID="688142c3c70b9a317414df3bbaa2620b7d213567bc82305c33beb3b206eb9dbd" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.615882 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:25 crc 
kubenswrapper[4792]: I0301 09:08:25.617305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.617427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.617452 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:25 crc kubenswrapper[4792]: I0301 09:08:25.618597 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:25 crc kubenswrapper[4792]: E0301 09:08:25.618969 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:25 crc kubenswrapper[4792]: E0301 09:08:25.877455 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:25Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.266090 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267656 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.267684 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:26 crc kubenswrapper[4792]: E0301 09:08:26.269510 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 01 09:08:26 crc kubenswrapper[4792]: E0301 09:08:26.271392 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.351795 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-01T09:08:26Z is after 2026-02-23T05:33:13Z Mar 01 09:08:26 crc kubenswrapper[4792]: I0301 09:08:26.622944 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 01 09:08:27 crc kubenswrapper[4792]: I0301 09:08:27.350408 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:27Z is after 2026-02-23T05:33:13Z Mar 01 09:08:28 crc kubenswrapper[4792]: W0301 09:08:28.275208 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z Mar 01 09:08:28 crc kubenswrapper[4792]: E0301 09:08:28.275314 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.352350 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:08:28Z is after 2026-02-23T05:33:13Z Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.498749 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:28 crc kubenswrapper[4792]: I0301 09:08:28.499221 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.348535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.348820 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.350584 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:29 crc kubenswrapper[4792]: I0301 09:08:29.352378 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:29Z is after 2026-02-23T05:33:13Z Mar 01 09:08:30 crc kubenswrapper[4792]: I0301 09:08:30.159520 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.165475 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.166932 4792 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 01 09:08:30 crc kubenswrapper[4792]: I0301 09:08:30.351017 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z Mar 01 09:08:30 crc kubenswrapper[4792]: W0301 09:08:30.536146 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z Mar 01 09:08:30 crc kubenswrapper[4792]: E0301 09:08:30.536403 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.017380 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.018041 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019841 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.019902 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.020660 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.020856 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:31 crc kubenswrapper[4792]: I0301 09:08:31.350524 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.497263 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:31 crc kubenswrapper[4792]: W0301 09:08:31.669049 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z Mar 01 09:08:31 crc kubenswrapper[4792]: E0301 09:08:31.669176 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.202884 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.203306 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 
09:08:32.205180 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.205252 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.205272 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.206279 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:32 crc kubenswrapper[4792]: E0301 09:08:32.206635 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:32 crc kubenswrapper[4792]: I0301 09:08:32.352977 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:32Z is after 2026-02-23T05:33:13Z Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.272794 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.274753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.275478 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:33 crc kubenswrapper[4792]: E0301 09:08:33.275792 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 01 09:08:33 crc kubenswrapper[4792]: E0301 09:08:33.278968 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 01 09:08:33 crc kubenswrapper[4792]: I0301 09:08:33.352539 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:33Z is after 2026-02-23T05:33:13Z Mar 01 09:08:34 crc kubenswrapper[4792]: I0301 09:08:34.468938 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:34Z is after 2026-02-23T05:33:13Z Mar 01 09:08:35 crc kubenswrapper[4792]: I0301 09:08:35.350240 4792 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:35Z is after 2026-02-23T05:33:13Z Mar 01 09:08:35 crc kubenswrapper[4792]: E0301 09:08:35.881858 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:36 crc kubenswrapper[4792]: I0301 09:08:36.352125 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:36Z is after 2026-02-23T05:33:13Z Mar 01 09:08:37 crc kubenswrapper[4792]: I0301 09:08:37.352587 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:08:37Z is after 2026-02-23T05:33:13Z Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.352195 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:08:38Z is after 2026-02-23T05:33:13Z Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.500137 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:38 crc kubenswrapper[4792]: I0301 09:08:38.500237 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:39 crc kubenswrapper[4792]: I0301 09:08:39.351194 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:39 crc kubenswrapper[4792]: W0301 09:08:39.415716 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:39 crc kubenswrapper[4792]: E0301 
09:08:39.415776 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.279534 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.281128 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:40 crc kubenswrapper[4792]: E0301 09:08:40.286200 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:40 crc kubenswrapper[4792]: E0301 09:08:40.286294 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:40 crc kubenswrapper[4792]: I0301 09:08:40.348106 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:41 crc kubenswrapper[4792]: I0301 09:08:41.353495 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:41 crc kubenswrapper[4792]: E0301 09:08:41.497574 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.350846 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.412316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.412465 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:42 crc kubenswrapper[4792]: I0301 09:08:42.413746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:43 crc kubenswrapper[4792]: I0301 09:08:43.351233 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:44 crc kubenswrapper[4792]: I0301 09:08:44.352184 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.352595 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.407834 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:45 crc kubenswrapper[4792]: I0301 09:08:45.409815 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.889459 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74dff1661c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,LastTimestamp:2026-03-01 09:07:51.343613468 +0000 UTC m=+0.585492665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.895051 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.901029 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.909871 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.915427 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e99c9cf7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.505829111 +0000 UTC m=+0.747708308,LastTimestamp:2026-03-01 09:07:51.505829111 +0000 UTC 
m=+0.747708308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.921271 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.509044676 +0000 UTC m=+0.750923873,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.926622 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.509067566 +0000 UTC m=+0.750946763,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.932268 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.509077396 +0000 UTC m=+0.750956593,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.938686 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.510543211 +0000 UTC m=+0.752422408,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.946690 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.510564522 +0000 UTC m=+0.752443719,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.950304 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.510573612 +0000 UTC m=+0.752452809,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.953658 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.511087651 +0000 UTC m=+0.752966848,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.956853 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.511123951 +0000 UTC m=+0.753003148,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.960244 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.511139271 +0000 UTC m=+0.753018458,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.963367 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.51223718 +0000 UTC m=+0.754116377,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.966803 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.512309911 +0000 UTC m=+0.754189108,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.970622 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.512317931 +0000 UTC m=+0.754197128,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.974534 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC 
m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.512383053 +0000 UTC m=+0.754262250,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.978518 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.512400153 +0000 UTC m=+0.754279350,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.982493 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.512426683 +0000 UTC m=+0.754305880,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.986427 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.514226754 +0000 UTC m=+0.756105951,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.989576 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.514242644 +0000 UTC m=+0.756121841,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.992805 4792 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8f137\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8f137 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393390903 +0000 UTC m=+0.635270100,LastTimestamp:2026-03-01 09:07:51.514254644 +0000 UTC m=+0.756133841,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.995848 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e82468\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e82468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393338472 +0000 UTC m=+0.635217669,LastTimestamp:2026-03-01 09:07:51.515412044 +0000 UTC m=+0.757291241,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:45 crc kubenswrapper[4792]: E0301 09:08:45.999011 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1898ac74e2e8c36e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1898ac74e2e8c36e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.393379182 +0000 UTC m=+0.635258379,LastTimestamp:2026-03-01 09:07:51.515429814 +0000 UTC m=+0.757309011,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.004969 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75033b4de0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.935659488 +0000 UTC m=+1.177538685,LastTimestamp:2026-03-01 09:07:51.935659488 +0000 UTC m=+1.177538685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.009072 4792 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac750342c65e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.936149086 +0000 UTC m=+1.178028283,LastTimestamp:2026-03-01 09:07:51.936149086 +0000 UTC m=+1.178028283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.015365 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7503b38ed0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.943540432 +0000 UTC m=+1.185419639,LastTimestamp:2026-03-01 09:07:51.943540432 +0000 UTC 
m=+1.185419639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.018846 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7504c1862e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.961232942 +0000 UTC m=+1.203112149,LastTimestamp:2026-03-01 09:07:51.961232942 +0000 UTC m=+1.203112149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.022101 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7505c6f253 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:51.978365523 +0000 UTC m=+1.220244730,LastTimestamp:2026-03-01 09:07:51.978365523 +0000 UTC m=+1.220244730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.025377 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75286b6ad3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.559569619 +0000 UTC m=+1.801448816,LastTimestamp:2026-03-01 09:07:52.559569619 +0000 UTC m=+1.801448816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.028262 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac7528c5651f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.565466399 +0000 UTC m=+1.807345596,LastTimestamp:2026-03-01 09:07:52.565466399 +0000 UTC m=+1.807345596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.031183 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7528ccf64e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.565962318 +0000 UTC m=+1.807841515,LastTimestamp:2026-03-01 09:07:52.565962318 +0000 UTC m=+1.807841515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.034169 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7528cd916d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.566002029 +0000 UTC m=+1.807881226,LastTimestamp:2026-03-01 09:07:52.566002029 +0000 UTC m=+1.807881226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.037230 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7528d8a72b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.566728491 +0000 UTC m=+1.808607688,LastTimestamp:2026-03-01 09:07:52.566728491 +0000 UTC m=+1.808607688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.040274 4792 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac752907a95b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.569809243 +0000 UTC m=+1.811688440,LastTimestamp:2026-03-01 09:07:52.569809243 +0000 UTC m=+1.811688440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.043346 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7529989bd8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.579308504 +0000 UTC m=+1.821187711,LastTimestamp:2026-03-01 09:07:52.579308504 +0000 UTC m=+1.821187711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.046431 4792 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7529ada02c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.580685868 +0000 UTC m=+1.822565065,LastTimestamp:2026-03-01 09:07:52.580685868 +0000 UTC m=+1.822565065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.050287 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac752a155fff openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.587485183 +0000 UTC m=+1.829364400,LastTimestamp:2026-03-01 09:07:52.587485183 +0000 UTC m=+1.829364400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.053312 4792 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a4a8530 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.590968112 +0000 UTC m=+1.832847299,LastTimestamp:2026-03-01 09:07:52.590968112 +0000 UTC m=+1.832847299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.056190 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a631bde openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC 
m=+1.834458747,LastTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC m=+1.834458747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.059073 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753c605ce1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,LastTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: I0301 09:08:46.733452 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.733770 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d405016 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC m=+2.150945499,LastTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC m=+2.150945499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.741997 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d5b7879 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.910846073 +0000 UTC m=+2.152725310,LastTimestamp:2026-03-01 09:07:52.910846073 +0000 UTC m=+2.152725310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.747460 4792 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754a119981 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.124108673 +0000 UTC m=+2.365987890,LastTimestamp:2026-03-01 09:07:53.124108673 +0000 UTC m=+2.365987890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.755352 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754af7b237 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.139188279 +0000 UTC m=+2.381067516,LastTimestamp:2026-03-01 09:07:53.139188279 +0000 UTC 
m=+2.381067516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.764111 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac754b0f4d17 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.140735255 +0000 UTC m=+2.382614472,LastTimestamp:2026-03-01 09:07:53.140735255 +0000 UTC m=+2.382614472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.771601 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7556abcc05 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.335540741 +0000 UTC m=+2.577419938,LastTimestamp:2026-03-01 09:07:53.335540741 +0000 UTC m=+2.577419938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.779521 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7557731cba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.348603066 +0000 UTC m=+2.590482283,LastTimestamp:2026-03-01 09:07:53.348603066 +0000 UTC m=+2.590482283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.784532 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac755d3f8e6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.445887594 +0000 UTC m=+2.687766831,LastTimestamp:2026-03-01 09:07:53.445887594 +0000 UTC m=+2.687766831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.794326 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac755ddc85a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.456174505 +0000 UTC m=+2.698053732,LastTimestamp:2026-03-01 09:07:53.456174505 +0000 UTC m=+2.698053732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.799516 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac755e3aa9da openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.462344154 +0000 UTC m=+2.704223371,LastTimestamp:2026-03-01 09:07:53.462344154 +0000 UTC m=+2.704223371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.807431 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac755e53a0b3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.463980211 +0000 UTC m=+2.705859448,LastTimestamp:2026-03-01 09:07:53.463980211 +0000 UTC m=+2.705859448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.813116 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac756af9f510 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.676207376 +0000 UTC m=+2.918086573,LastTimestamp:2026-03-01 09:07:53.676207376 +0000 UTC m=+2.918086573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.818267 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756b22b90a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.678878986 +0000 UTC m=+2.920758183,LastTimestamp:2026-03-01 09:07:53.678878986 +0000 UTC m=+2.920758183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.822730 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756bc4c6fa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.689499386 +0000 UTC m=+2.931378583,LastTimestamp:2026-03-01 09:07:53.689499386 +0000 UTC m=+2.931378583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.827166 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756c095ba0 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.693993888 +0000 UTC m=+2.935873085,LastTimestamp:2026-03-01 09:07:53.693993888 +0000 UTC m=+2.935873085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.831973 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac756c24942e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.695777838 +0000 UTC m=+2.937657035,LastTimestamp:2026-03-01 09:07:53.695777838 +0000 UTC m=+2.937657035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.836821 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac756c7abc08 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.701424136 +0000 UTC m=+2.943303353,LastTimestamp:2026-03-01 09:07:53.701424136 +0000 UTC m=+2.943303353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.839005 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac756c9742c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.703293638 +0000 UTC m=+2.945172835,LastTimestamp:2026-03-01 09:07:53.703293638 +0000 UTC m=+2.945172835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.843942 4792 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756de494e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.725138151 +0000 UTC m=+2.967017348,LastTimestamp:2026-03-01 09:07:53.725138151 +0000 UTC m=+2.967017348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.848074 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1898ac756df25b78 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.726040952 +0000 UTC m=+2.967920149,LastTimestamp:2026-03-01 09:07:53.726040952 +0000 UTC m=+2.967920149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.852431 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac756df595d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.726252497 +0000 UTC m=+2.968131694,LastTimestamp:2026-03-01 09:07:53.726252497 +0000 UTC m=+2.968131694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.856924 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7578237b11 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.897032465 +0000 UTC 
m=+3.138911672,LastTimestamp:2026-03-01 09:07:53.897032465 +0000 UTC m=+3.138911672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.861436 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757970263d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.918834237 +0000 UTC m=+3.160713444,LastTimestamp:2026-03-01 09:07:53.918834237 +0000 UTC m=+3.160713444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.865557 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac757973a831 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.919064113 +0000 UTC m=+3.160943320,LastTimestamp:2026-03-01 09:07:53.919064113 +0000 UTC m=+3.160943320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.870677 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac757990b82a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.920968746 +0000 UTC m=+3.162847943,LastTimestamp:2026-03-01 09:07:53.920968746 +0000 UTC m=+3.162847943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.875560 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757a632257 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.934758487 +0000 UTC m=+3.176637684,LastTimestamp:2026-03-01 09:07:53.934758487 +0000 UTC m=+3.176637684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.879716 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac757a9fdf17 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:53.938738967 +0000 UTC m=+3.180618154,LastTimestamp:2026-03-01 09:07:53.938738967 +0000 UTC m=+3.180618154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.884297 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75868facc5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.139004101 +0000 UTC m=+3.380883298,LastTimestamp:2026-03-01 09:07:54.139004101 +0000 UTC m=+3.380883298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.888274 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac758737afa0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.15001488 +0000 UTC m=+3.391894077,LastTimestamp:2026-03-01 09:07:54.15001488 +0000 UTC m=+3.391894077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.891990 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75874631c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.150965702 +0000 UTC m=+3.392844899,LastTimestamp:2026-03-01 09:07:54.150965702 +0000 UTC m=+3.392844899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.899161 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac758888fa80 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.17211968 +0000 
UTC m=+3.413998877,LastTimestamp:2026-03-01 09:07:54.17211968 +0000 UTC m=+3.413998877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.903087 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1898ac75899bc1ec openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.190127596 +0000 UTC m=+3.432006783,LastTimestamp:2026-03-01 09:07:54.190127596 +0000 UTC m=+3.432006783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.909100 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac75930bfd4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.348477774 +0000 UTC m=+3.590356981,LastTimestamp:2026-03-01 09:07:54.348477774 +0000 UTC m=+3.590356981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.912689 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593be3ea4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.360159908 +0000 UTC m=+3.602039105,LastTimestamp:2026-03-01 09:07:54.360159908 +0000 UTC m=+3.602039105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.918467 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593ceb0a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,LastTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.922737 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac759ae4b79e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.480121758 +0000 UTC m=+3.722000945,LastTimestamp:2026-03-01 09:07:54.480121758 +0000 UTC m=+3.722000945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.926189 4792 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759e6b1cba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,LastTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.929538 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759ef2cbb1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,LastTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.933058 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75a689e7ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.675496942 +0000 UTC m=+3.917376139,LastTimestamp:2026-03-01 09:07:54.675496942 +0000 UTC m=+3.917376139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.938417 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75a72ef6ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.686314156 +0000 UTC m=+3.928193343,LastTimestamp:2026-03-01 09:07:54.686314156 +0000 UTC m=+3.928193343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.943745 
4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75d6ef9f4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.487469386 +0000 UTC m=+4.729348603,LastTimestamp:2026-03-01 09:07:55.487469386 +0000 UTC m=+4.729348603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.948777 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e373e090 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.69746344 +0000 UTC m=+4.939342637,LastTimestamp:2026-03-01 09:07:55.69746344 +0000 UTC m=+4.939342637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.953837 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e416984f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.708127311 +0000 UTC m=+4.950006508,LastTimestamp:2026-03-01 09:07:55.708127311 +0000 UTC m=+4.950006508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.958791 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75e42c334b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.709543243 +0000 UTC m=+4.951422470,LastTimestamp:2026-03-01 09:07:55.709543243 +0000 UTC m=+4.951422470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.963005 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f2d88a8a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.955718794 +0000 UTC m=+5.197597991,LastTimestamp:2026-03-01 09:07:55.955718794 +0000 UTC m=+5.197597991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.966305 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f3b4457f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.970119039 +0000 UTC m=+5.211998236,LastTimestamp:2026-03-01 09:07:55.970119039 +0000 UTC m=+5.211998236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc 
kubenswrapper[4792]: E0301 09:08:46.970164 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac75f3cbf698 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:55.971671704 +0000 UTC m=+5.213550901,LastTimestamp:2026-03-01 09:07:55.971671704 +0000 UTC m=+5.213550901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.973192 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac76037f2b29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.235074345 +0000 UTC m=+5.476953572,LastTimestamp:2026-03-01 09:07:56.235074345 +0000 UTC m=+5.476953572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.976014 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac76048c3c5e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.252707934 +0000 UTC m=+5.494587161,LastTimestamp:2026-03-01 09:07:56.252707934 +0000 UTC m=+5.494587161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.981705 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7604a64dd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.254416342 +0000 UTC m=+5.496295569,LastTimestamp:2026-03-01 09:07:56.254416342 +0000 UTC 
m=+5.496295569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.987577 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7612cfdb6e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.49202059 +0000 UTC m=+5.733899777,LastTimestamp:2026-03-01 09:07:56.49202059 +0000 UTC m=+5.733899777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.993101 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761366ee4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.501921354 +0000 UTC m=+5.743800551,LastTimestamp:2026-03-01 09:07:56.501921354 +0000 UTC m=+5.743800551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:46 crc kubenswrapper[4792]: E0301 09:08:46.999045 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac7613825da6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.503719334 +0000 UTC m=+5.745598531,LastTimestamp:2026-03-01 09:07:56.503719334 +0000 UTC m=+5.745598531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.003162 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761ed176b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.693452471 +0000 UTC m=+5.935331668,LastTimestamp:2026-03-01 09:07:56.693452471 +0000 UTC 
m=+5.935331668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.006851 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1898ac761f7403e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:56.704105441 +0000 UTC m=+5.945984638,LastTimestamp:2026-03-01 09:07:56.704105441 +0000 UTC m=+5.945984638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.011140 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac768a797c5a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:58.499626074 +0000 UTC m=+7.741505301,LastTimestamp:2026-03-01 09:07:58.499626074 +0000 UTC m=+7.741505301,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.014408 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac768a7ad0e7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:58.499713255 +0000 UTC m=+7.741592492,LastTimestamp:2026-03-01 09:07:58.499713255 +0000 UTC m=+7.741592492,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.019289 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7593ceb0a8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7593ceb0a8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.361237672 +0000 UTC m=+3.603116869,LastTimestamp:2026-03-01 09:08:05.531429824 +0000 UTC m=+14.773309021,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.023206 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac759e6b1cba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759e6b1cba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.539261114 +0000 UTC m=+3.781140311,LastTimestamp:2026-03-01 09:08:05.784587199 +0000 UTC m=+15.026466406,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 
09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.026484 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac759ef2cbb1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac759ef2cbb1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:54.548153265 +0000 UTC m=+3.790032462,LastTimestamp:2026-03-01 09:08:05.802115155 +0000 UTC m=+15.043994352,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.030025 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f70a28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 01 09:08:47 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" 
cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:47 crc kubenswrapper[4792]: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,LastTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.033216 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f7bf24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,LastTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.037182 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7840f70a28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f70a28 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 01 09:08:47 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 01 09:08:47 crc kubenswrapper[4792]: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856274984 +0000 UTC m=+15.098154181,LastTimestamp:2026-03-01 09:08:05.866560436 +0000 UTC m=+15.108439643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.042231 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1898ac7840f7bf24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1898ac7840f7bf24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:05.856321316 +0000 UTC m=+15.098200513,LastTimestamp:2026-03-01 09:08:05.866671319 +0000 UTC 
m=+15.108550526,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.046146 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.049374 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.054351 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:18.500007446 
+0000 UTC m=+27.741886643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.057491 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de9075f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:18.500073558 +0000 UTC m=+27.741952745,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.061382 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac7b32c1bd0a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:18.502802698 +0000 UTC m=+27.744681895,LastTimestamp:2026-03-01 09:08:18.502802698 +0000 UTC m=+27.744681895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.064692 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac752a631bde\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac752a631bde openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.59257955 +0000 UTC m=+1.834458747,LastTimestamp:2026-03-01 09:08:18.630126718 +0000 UTC m=+27.872005915,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.068159 4792 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac753c605ce1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753c605ce1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.894389473 +0000 UTC m=+2.136268690,LastTimestamp:2026-03-01 09:08:18.829823663 +0000 UTC m=+28.071702870,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.071815 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac753d405016\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac753d405016 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:07:52.909066262 +0000 UTC 
m=+2.150945499,LastTimestamp:2026-03-01 09:08:18.840047623 +0000 UTC m=+28.081926820,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.076232 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:28.499180456 +0000 UTC m=+37.741059713,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.080038 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de9075f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de9075f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500352503 +0000 UTC m=+17.742231700,LastTimestamp:2026-03-01 09:08:28.499392611 +0000 UTC m=+37.741271868,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.085387 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1898ac78de8f7303\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 01 09:08:47 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.1898ac78de8f7303 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 01 09:08:47 crc kubenswrapper[4792]: body: Mar 01 09:08:47 crc kubenswrapper[4792]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:08:08.500286211 +0000 UTC m=+17.742165418,LastTimestamp:2026-03-01 09:08:38.500206703 +0000 UTC m=+47.742085930,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 01 09:08:47 crc kubenswrapper[4792]: > Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.287246 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.288546 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.293483 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.293826 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.348094 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.747745 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.749205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752417 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" exitCode=255 Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c"} Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752537 4792 scope.go:117] "RemoveContainer" containerID="01dac4fce4f31460a90ca7eaf5d97ffd8254d0b93b55dcb4598bd038633b40d9" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.752718 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754287 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.754444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 01 09:08:47 crc kubenswrapper[4792]: I0301 09:08:47.755344 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:47 crc kubenswrapper[4792]: E0301 09:08:47.755620 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.351937 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500053 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500169 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500247 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.500436 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.501993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502048 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502686 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.502880 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867" gracePeriod=30 Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.756757 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.760661 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.761868 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762448 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867" exitCode=255 Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867"} Mar 01 09:08:48 crc kubenswrapper[4792]: I0301 09:08:48.762540 4792 scope.go:117] "RemoveContainer" containerID="684aa0b5aeaecf3fb572ab400e718956a5373c4fa6651e742d65bdbc3425b9b2" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.351651 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.765532 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.766870 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c"} Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.766984 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:49 crc kubenswrapper[4792]: I0301 09:08:49.767752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.350199 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.769299 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.770956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.771011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:50 crc kubenswrapper[4792]: I0301 09:08:50.771027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.017734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.017922 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.018978 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019088 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.019748 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:51 crc kubenswrapper[4792]: E0301 09:08:51.020017 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:51 crc kubenswrapper[4792]: I0301 09:08:51.352056 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:51 crc kubenswrapper[4792]: E0301 09:08:51.498188 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.203083 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.203285 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.204556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.205223 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:08:52 crc kubenswrapper[4792]: E0301 09:08:52.205457 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:08:52 crc kubenswrapper[4792]: I0301 09:08:52.350864 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:53 crc kubenswrapper[4792]: I0301 09:08:53.351245 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 
09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.294487 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.295839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.295972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.296105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.296211 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:08:54 crc kubenswrapper[4792]: E0301 09:08:54.301031 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:08:54 crc kubenswrapper[4792]: E0301 09:08:54.301126 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:08:54 crc kubenswrapper[4792]: I0301 09:08:54.348331 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.349989 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.498728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.498929 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.499979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.500114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.500185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.502928 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.780838 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.780933 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:55 crc kubenswrapper[4792]: I0301 09:08:55.782090 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.354022 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.784518 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786359 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:56 crc kubenswrapper[4792]: I0301 09:08:56.786374 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:08:57 crc kubenswrapper[4792]: I0301 09:08:57.351179 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:57 crc kubenswrapper[4792]: W0301 09:08:57.852027 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 01 09:08:57 crc kubenswrapper[4792]: E0301 09:08:57.852090 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 01 09:08:58 crc kubenswrapper[4792]: I0301 09:08:58.350132 4792 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.351787 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.354442 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.354624 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:08:59 crc kubenswrapper[4792]: I0301 09:08:59.356406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:00 crc kubenswrapper[4792]: I0301 09:09:00.350484 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.301889 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303090 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.303169 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.305732 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.305922 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 01 09:09:01 crc kubenswrapper[4792]: I0301 09:09:01.350291 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:01 crc kubenswrapper[4792]: E0301 09:09:01.498654 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:02 crc kubenswrapper[4792]: I0301 09:09:02.168467 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 01 09:09:02 crc kubenswrapper[4792]: I0301 09:09:02.179759 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 01 09:09:02 crc 
kubenswrapper[4792]: I0301 09:09:02.350828 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.350324 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.408016 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:03 crc kubenswrapper[4792]: I0301 09:09:03.409734 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:03 crc kubenswrapper[4792]: E0301 09:09:03.409968 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:04 crc kubenswrapper[4792]: I0301 09:09:04.351189 4792 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:05 crc kubenswrapper[4792]: I0301 09:09:05.352522 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.352967 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.372730 4792 csr.go:261] certificate signing request csr-j9p7j is approved, waiting to be issued Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.380125 4792 csr.go:257] certificate signing request csr-j9p7j is issued Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.419348 4792 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 01 09:09:06 crc kubenswrapper[4792]: I0301 09:09:06.723560 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.204878 4792 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 01 09:09:07 crc kubenswrapper[4792]: W0301 09:09:07.205086 4792 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.382070 4792 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 03:27:09.919042952 +0000 UTC Mar 01 09:09:07 crc kubenswrapper[4792]: I0301 09:09:07.382969 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6402h18m2.536085007s for next certificate rotation Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.306420 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307804 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.307948 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.315663 4792 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.315986 4792 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.316020 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319639 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.319694 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.337045 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:08 crc kubenswrapper[4792]: I0301 09:09:08.345568 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:08Z","lastTransitionTime":"2026-03-01T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391410 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391516 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.391537 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.492598 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.593337 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.694263 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.794672 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.895193 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:08 crc kubenswrapper[4792]: E0301 09:09:08.995838 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.096825 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.197694 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.298370 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.399222 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.408625 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:09 crc kubenswrapper[4792]: I0301 09:09:09.409824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.500333 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.600951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.702049 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.803091 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:09 crc kubenswrapper[4792]: E0301 09:09:09.903987 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.005071 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc 
kubenswrapper[4792]: E0301 09:09:10.106189 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.206589 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.307151 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.407433 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.508372 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.608469 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.709429 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.809877 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:10 crc kubenswrapper[4792]: E0301 09:09:10.910959 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.011704 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.112800 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.213947 4792 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.314341 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.414838 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.498976 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.515492 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.616359 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.717039 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: I0301 09:09:11.729997 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.817189 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:11 crc kubenswrapper[4792]: E0301 09:09:11.918087 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.019206 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.120036 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.220173 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.320375 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.421231 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.521486 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.622508 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.723041 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.823287 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:12 crc kubenswrapper[4792]: E0301 09:09:12.923429 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.024199 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.125182 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.226068 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.326684 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc 
kubenswrapper[4792]: E0301 09:09:13.426977 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.527842 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.628291 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.729322 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.830001 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:13 crc kubenswrapper[4792]: E0301 09:09:13.931438 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.031584 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.133088 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.233937 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.334289 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.434887 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.535986 4792 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.637047 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.738261 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.838434 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:14 crc kubenswrapper[4792]: E0301 09:09:14.939411 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.040393 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.141452 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.242341 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.342718 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.444250 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.544383 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.645292 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.745989 4792 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.846440 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:15 crc kubenswrapper[4792]: E0301 09:09:15.947616 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.048502 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.149751 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.250973 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.351865 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.452782 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.553587 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.654381 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.755310 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 09:09:16.855614 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:16 crc kubenswrapper[4792]: E0301 
09:09:16.956754 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.057052 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.157710 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.258545 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.359272 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.460408 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.561136 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.662233 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.762833 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.863667 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:17 crc kubenswrapper[4792]: E0301 09:09:17.964462 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.065336 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 
09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.166167 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.267252 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.367899 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.408475 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.409920 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.410493 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.410640 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.468476 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.569373 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.652221 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.656331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.665782 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669585 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.669620 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.680084 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684107 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684186 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.684200 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.695567 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699330 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:18 crc kubenswrapper[4792]: I0301 09:09:18.699345 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:18Z","lastTransitionTime":"2026-03-01T09:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708215 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708329 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.708354 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.809290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:18 crc kubenswrapper[4792]: E0301 09:09:18.910172 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.010954 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.111583 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.212346 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.313268 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.414305 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.515114 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.615944 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.716855 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.817183 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:19 crc kubenswrapper[4792]: E0301 09:09:19.917925 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.018606 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.119246 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.220288 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.321385 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.422179 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.523037 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.624046 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.724324 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc 
kubenswrapper[4792]: E0301 09:09:20.824502 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:20 crc kubenswrapper[4792]: E0301 09:09:20.925223 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.025314 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.126276 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.226570 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.326693 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.427074 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.499165 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.527606 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.628181 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.728769 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.829752 4792 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 01 09:09:21 crc kubenswrapper[4792]: E0301 09:09:21.930819 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.031529 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.132190 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.233025 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.333117 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.434105 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.535161 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.635990 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.736181 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.836291 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:22 crc kubenswrapper[4792]: E0301 09:09:22.936393 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.037010 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.137581 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.238397 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.339360 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.440031 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.541170 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.642098 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.742617 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.844114 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:23 crc kubenswrapper[4792]: E0301 09:09:23.945173 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.045815 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.146616 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc 
kubenswrapper[4792]: E0301 09:09:24.247231 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.347805 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.408665 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409665 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:24 crc kubenswrapper[4792]: I0301 09:09:24.409752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.448735 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.549695 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.650655 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.751074 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.851693 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:24 crc kubenswrapper[4792]: E0301 09:09:24.951897 4792 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.052491 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.152576 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.252868 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.354001 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.455083 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.556128 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.656227 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.756519 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.856829 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:25 crc kubenswrapper[4792]: E0301 09:09:25.957479 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.058102 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.158976 4792 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.259827 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.360024 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.461173 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.562178 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.663031 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.763794 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.864563 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:26 crc kubenswrapper[4792]: E0301 09:09:26.965414 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.066377 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.167343 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.267726 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 
09:09:27.368647 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.470110 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.570931 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.672599 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.773640 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.874279 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:27 crc kubenswrapper[4792]: E0301 09:09:27.974408 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.075052 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.175373 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.276402 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.376804 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.477191 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 
09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.577422 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.678560 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.779100 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.879376 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.980409 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: E0301 09:09:28.992630 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.994057 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:28 crc kubenswrapper[4792]: I0301 09:09:28.996122 4792 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:28Z","lastTransitionTime":"2026-03-01T09:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.005497 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.008798 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.019775 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.022979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023015 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.023082 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.034153 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.037975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:29 crc kubenswrapper[4792]: I0301 09:09:29.038041 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:29Z","lastTransitionTime":"2026-03-01T09:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.046654 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.047015 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.081236 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.182178 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.283049 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.383996 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.484884 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.585761 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.687580 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.788744 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.889179 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:29 crc kubenswrapper[4792]: E0301 09:09:29.990124 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.090785 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.191318 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.292423 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.392760 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.494129 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.594290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.695345 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.795951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.896126 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:30 crc kubenswrapper[4792]: E0301 09:09:30.997322 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.098297 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc 
kubenswrapper[4792]: E0301 09:09:31.198882 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.299803 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.400956 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.408456 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.410438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.411773 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.500040 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.501133 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.601819 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.702189 4792 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.802570 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.865985 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867108 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2"} Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867222 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:31 crc kubenswrapper[4792]: I0301 09:09:31.867926 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:31 crc kubenswrapper[4792]: E0301 09:09:31.903472 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.004263 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.105398 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc 
kubenswrapper[4792]: I0301 09:09:32.202322 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.206535 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.306866 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.407101 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.507647 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.608192 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.708951 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.810184 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.872174 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.872600 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874471 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" exitCode=255 Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2"} Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.874836 4792 scope.go:117] "RemoveContainer" containerID="63c6c961bf7ee7b8486397f8898f6cd31605a2d02f3c3b3627c7c906e42eb44c" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.875210 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.876527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:32 crc kubenswrapper[4792]: I0301 09:09:32.877042 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.877202 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:32 crc kubenswrapper[4792]: E0301 09:09:32.911420 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.012232 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.091148 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114856 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.114866 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218155 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.218173 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.321276 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423766 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423840 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.423851 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526592 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.526618 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.629961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630033 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.630062 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.732984 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.733008 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.754831 4792 apiserver.go:52] "Watching apiserver" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.759449 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.759765 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760254 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760376 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760390 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.760629 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760668 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.760883 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760387 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.761151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.760724 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.763900 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764357 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764465 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.764589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.765088 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.765998 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.767432 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.767691 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.768518 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.787519 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.800939 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.813027 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.822578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.835282 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.841878 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.852413 4792 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.852756 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.864104 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.872509 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.878451 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.887561 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.893721 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.894281 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.894661 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.900969 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905765 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905932 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905956 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.905979 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: 
I0301 09:09:33.906000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906045 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906111 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906154 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906221 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906243 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906308 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906331 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906354 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906459 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906480 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906501 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906521 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906544 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906569 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906590 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906723 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906746 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906813 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906834 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906983 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907025 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907070 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907091 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907272 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907353 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907400 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907448 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907493 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907538 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907583 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907605 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907626 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907650 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907764 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907853 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907874 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907895 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907933 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907954 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907974 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908070 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908188 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908211 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914108 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914189 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914267 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914306 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914597 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914705 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914798 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914876 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914955 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915043 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915125 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915311 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915409 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915473 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915508 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915537 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915602 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915629 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915753 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915817 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915884 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 01
09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915968 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915999 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916034 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916098 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916193 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 01 
09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916485 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916514 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916585 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 01 09:09:33 
crc kubenswrapper[4792]: I0301 09:09:33.916652 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916726 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916947 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917074 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917326 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917432 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917467 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917667 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917693 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod 
"6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906854 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.906960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907060 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907257 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.907988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908142 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908254 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.908379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912524 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.912763 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913015 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913276 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.913772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914697 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.914835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.921044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.915789 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.916583 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.917709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.918362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.918788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919443 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919849 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.919900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.920408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.920414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922429 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922567 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.922874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923283 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923391 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923485 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923693 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924085 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.924641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.929950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930203 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930560 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.930853 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931045 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.931516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932020 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.932859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.933873 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.923244 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934662 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.934848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935210 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.935500 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.935676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.936239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937262 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937710 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.937779 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.938811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939337 4792 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939587 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939721 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.939952 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.940989 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941070 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941555 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.941760 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942420 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.942739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943068 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943304 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.943751 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944118 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944437 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.944823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.945439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.945764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.946382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947291 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947301 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947341 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947501 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947534 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.947561 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:33Z","lastTransitionTime":"2026-03-01T09:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.948309 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.948791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949493 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.949727 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950067 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950559 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.950554 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951520 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.951992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.952062 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.952318 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.442266466 +0000 UTC m=+103.684145853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.952590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953069 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953097 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953113 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.953206 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.45315565 +0000 UTC m=+103.695034867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.953747 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.956537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956619 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956640 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956654 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.956691 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.456678679 +0000 UTC m=+103.698557896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957814 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957836 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.957972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.958812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.953814 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.954093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.960510 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961450 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961902 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.961960 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.962624 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.962701 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.963365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.963454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.964046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.964828 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965048 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.965371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.966783 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.966977 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.968771 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.969354 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.469329587 +0000 UTC m=+103.711208994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969812 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.969969 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970511 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.971313 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970803 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: E0301 09:09:33.971395 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:34.471372399 +0000 UTC m=+103.713251806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.970930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.971830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.972011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.972366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.973011 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.978466 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.981071 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.982878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.986323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.989410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:33 crc kubenswrapper[4792]: I0301 09:09:33.992311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.003758 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.006793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.008791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.009330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018775 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018821 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018833 4792 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018853 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018866 4792 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.018860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018874 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018929 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018945 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018959 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018971 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018984 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 
01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018996 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.018999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019009 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019061 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019073 4792 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019084 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019094 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019104 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019116 4792 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019127 4792 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019137 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019147 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019158 4792 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019169 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 
crc kubenswrapper[4792]: I0301 09:09:34.019181 4792 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019192 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019201 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019209 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019221 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019234 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019246 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019257 4792 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019267 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019275 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019283 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019291 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019299 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019309 4792 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019318 4792 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019326 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019337 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019349 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019359 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019371 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019381 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019391 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019399 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019408 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019417 4792 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019425 4792 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019433 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019441 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019450 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019458 4792 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019466 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019475 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019483 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019492 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019501 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019510 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019519 4792 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019527 4792 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019536 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019544 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019556 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019565 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019577 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019588 4792 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019598 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019609 4792 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019620 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019629 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019638 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019646 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019655 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019663 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019671 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019679 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019687 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019695 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019706 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019716 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath 
\"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019727 4792 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019737 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019746 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019757 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019765 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019774 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019782 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019790 4792 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019798 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019806 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019814 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019824 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019832 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019840 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019848 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019856 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019865 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019873 4792 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019882 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019890 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019898 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019922 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019930 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019938 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019948 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019958 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019968 4792 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019976 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.019984 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.019992 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020001 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020009 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020019 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020029 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020041 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020053 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020062 4792 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020071 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020080 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020088 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020096 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020105 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020114 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020123 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020132 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020142 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020152 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020161 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020169 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020178 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020186 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020196 4792 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020205 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020217 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020228 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020237 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020245 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020254 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020262 4792 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020271 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020281 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020290 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020300 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020309 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020317 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020325 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020334 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020341 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020350 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020358 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020365 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020374 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020385 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020394 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020402 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020410 4792 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020417 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020426 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020433 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020442 4792 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 
09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020451 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020460 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020469 4792 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020478 4792 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020488 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020495 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020503 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020512 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020523 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020534 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020543 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020552 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020561 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020570 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020579 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020587 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020596 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020605 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020613 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020622 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020630 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020639 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc 
kubenswrapper[4792]: I0301 09:09:34.020647 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020655 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020666 4792 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020674 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020682 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020691 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020699 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020707 4792 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.020716 4792 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050334 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.050413 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.086838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.099817 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.100841 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0 WatchSource:0}: Error finding container 12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0: Status 404 returned error can't find the container with id 12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0 Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.110165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.111246 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73 WatchSource:0}: Error finding container 74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73: Status 404 returned error can't find the container with id 74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73 Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.122094 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0 WatchSource:0}: Error finding container 9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0: Status 404 returned error can't find the container with id 9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0 Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152570 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152630 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.152639 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.255393 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357699 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357708 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357723 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.357733 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459534 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.459561 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524171 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524149966 +0000 UTC m=+104.766029163 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524195 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.524202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524229 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524221328 +0000 UTC m=+104.766100525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524294 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524309 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524304 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524353 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524368 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524370 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524408 4792 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524399053 +0000 UTC m=+104.766278250 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524321 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524423 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524416213 +0000 UTC m=+104.766295410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.524457 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:35.524438023 +0000 UTC m=+104.766317230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561150 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.561227 4792 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619254 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bqszv"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619524 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zql8j"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619683 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.619866 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622166 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622180 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.622530 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.623272 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.624107 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625120 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625373 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.625492 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.643769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.653009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.661946 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663324 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663360 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.663397 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.669557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.679444 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.688634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.698114 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.707000 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.717230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726962 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.726999 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwr4p\" (UniqueName: \"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727128 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.727164 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.728720 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.737310 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.765727 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.788966 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.807442 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.818696 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwr4p\" (UniqueName: \"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") 
" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9105f6b0-6f16-47aa-8009-73736a90b765-rootfs\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.828536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-hosts-file\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.829036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9105f6b0-6f16-47aa-8009-73736a90b765-mcd-auth-proxy-config\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.829652 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.841835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.853991 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.859574 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9105f6b0-6f16-47aa-8009-73736a90b765-proxy-tls\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.860192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwr4p\" (UniqueName: 
\"kubernetes.io/projected/9105f6b0-6f16-47aa-8009-73736a90b765-kube-api-access-qwr4p\") pod \"machine-config-daemon-bqszv\" (UID: \"9105f6b0-6f16-47aa-8009-73736a90b765\") " pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.860287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzhv\" (UniqueName: \"kubernetes.io/projected/0982a9bb-56d4-4e1c-86cb-76a4152de9ba-kube-api-access-hjzhv\") pod \"node-resolver-zql8j\" (UID: \"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\") " pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.868161 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883899 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.883953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"74b19e7a9fee01e9ac34eb744d89824e9cd74e491b82483d9e467d08db94da73"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.887060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.887173 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"12fd174335061da629c1882ca1f77f94c61a341aa7323ea0f73130b5e76e91c0"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.888100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9b7eb2bcfda86474e39ff54de92c5f284cc19d1fed02f206b5e7c7210ed5f7f0"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.888477 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:34 crc kubenswrapper[4792]: E0301 09:09:34.888595 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.896274 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.908589 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.920347 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.931045 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.933708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zql8j" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.941859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.944735 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 09:09:34.948168 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0982a9bb_56d4_4e1c_86cb_76a4152de9ba.slice/crio-ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f WatchSource:0}: Error finding container ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f: Status 404 returned error can't find the container with id ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f Mar 01 09:09:34 crc kubenswrapper[4792]: W0301 
09:09:34.952339 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9105f6b0_6f16_47aa_8009_73736a90b765.slice/crio-77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c WatchSource:0}: Error finding container 77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c: Status 404 returned error can't find the container with id 77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.958050 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980201 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980263 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.980331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:34Z","lastTransitionTime":"2026-03-01T09:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.981448 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.988190 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rbwx8"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.988807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pq28p"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.989042 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pq28p" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.989553 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991019 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991597 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991701 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991706 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.991867 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.994135 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.994648 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.995383 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.998793 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999010 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999082 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999451 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999553 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 01 09:09:34 crc kubenswrapper[4792]: I0301 09:09:34.999585 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:34.999995 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:34Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.015086 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.029661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.041667 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.050191 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.064251 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.079856 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082362 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.082399 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.090863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.108895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.123781 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130920 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.130993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131064 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131127 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131142 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131166 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131197 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131293 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131333 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131401 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131435 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131518 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131533 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc 
kubenswrapper[4792]: I0301 09:09:35.131622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131725 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " 
pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131782 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131852 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " 
pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.131960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.136957 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.154523 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.171959 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.184987 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185367 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.185379 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.192476 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232824 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-etc-kubernetes\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.232985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233410 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: 
\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cni-binary-copy\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: 
\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-bin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " 
pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233958 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-multus-certs\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-netns\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.233983 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-cnibin\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-socket-dir-parent\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234269 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-kubelet\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234379 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-run-k8s-cni-cncf-io\") pod \"multus-pq28p\" (UID: 
\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 
09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234599 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234653 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-hostroot\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-os-release\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-conf-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-host-var-lib-cni-multus\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") 
pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234841 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-cnibin\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" 
(UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.234983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235044 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-system-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc 
kubenswrapper[4792]: I0301 09:09:35.235067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235114 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235118 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-system-cni-dir\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235118 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/131582d9-bd96-444b-a597-ceb81e2b2085-cni-binary-copy\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-os-release\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235716 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-daemon-config\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235718 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/131582d9-bd96-444b-a597-ceb81e2b2085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rbwx8\" 
(UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.235814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.236392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-multus-cni-dir\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.241640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.252445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpbv\" (UniqueName: \"kubernetes.io/projected/ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3-kube-api-access-hxpbv\") pod \"multus-pq28p\" (UID: \"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\") " pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.256715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrws\" (UniqueName: \"kubernetes.io/projected/131582d9-bd96-444b-a597-ceb81e2b2085-kube-api-access-mvrws\") pod \"multus-additional-cni-plugins-rbwx8\" (UID: \"131582d9-bd96-444b-a597-ceb81e2b2085\") " 
pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.287802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.288568 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.356311 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pq28p" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.360295 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"ovnkube-node-7pp7m\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.363020 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.367017 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad44dee2_f99e_4e77_bc6a_2ab7f39eddf3.slice/crio-bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8 WatchSource:0}: Error finding container bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8: Status 404 returned error can't find the container with id bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.369840 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.376205 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131582d9_bd96_444b_a597_ceb81e2b2085.slice/crio-1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226 WatchSource:0}: Error finding container 1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226: Status 404 returned error can't find the container with id 1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226 Mar 01 09:09:35 crc kubenswrapper[4792]: W0301 09:09:35.384179 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bd7bac_21cf_4657_ab84_68a14f99f8f0.slice/crio-50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57 WatchSource:0}: Error finding container 50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57: Status 404 returned error can't find the container with id 50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390647 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390720 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.390729 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410028 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.410073 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410133 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410269 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.410353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.414635 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.415308 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.417404 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.418455 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.419620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.420608 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.422148 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.422705 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.424087 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.424678 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.425238 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.426414 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.426955 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.427843 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.428410 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.429399 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.430225 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.430628 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.431550 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.432333 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.432827 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.433878 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.434307 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.435482 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.436007 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.437010 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.437630 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.438676 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.439412 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.440294 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.441460 4792 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.441562 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.443761 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.445341 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.445807 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.447409 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.448558 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.449163 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.450248 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.450921 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.451788 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.452750 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.454005 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.454631 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.455537 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.456068 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.456994 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.457669 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.458486 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.459113 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.459995 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.460537 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.461095 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.462342 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492918 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.492973 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.537870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.537989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538046 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538014711 +0000 UTC m=+106.779893918 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.538163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538099 4792 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538233 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538214716 +0000 UTC m=+106.780094133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538350 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538150 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538401 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538417 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:35 
crc kubenswrapper[4792]: E0301 09:09:35.538175 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538374 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538523 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538468 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538454432 +0000 UTC m=+106.780333859 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538570 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:09:37.538558675 +0000 UTC m=+106.780438092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:35 crc kubenswrapper[4792]: E0301 09:09:35.538586 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:37.538577425 +0000 UTC m=+106.780456882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595068 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595093 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.595104 4792 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697442 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.697451 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.799981 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.891923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"77ac8f0f45b53e6a6ab8f82d7543660fc5c9f8ea61d9196529ac31e5ab986a7c"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893218 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0" exitCode=0 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.893293 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"1ce7de19ad7ca9fbffde845a67c6237dfb5fe98b6b1fff71cd698df905e9d226"} Mar 01 09:09:35 crc 
kubenswrapper[4792]: I0301 09:09:35.894814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.894898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"bf0d8f1e23790f6324b6b66e068f8d94854a2a7119fd193c81aa2574ed9f7de8"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896332 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" exitCode=0 Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.896478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.897920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zql8j" event={"ID":"0982a9bb-56d4-4e1c-86cb-76a4152de9ba","Type":"ContainerStarted","Data":"bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.904044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zql8j" 
event={"ID":"0982a9bb-56d4-4e1c-86cb-76a4152de9ba","Type":"ContainerStarted","Data":"ddbf2f829a1dc12dae3b569d0616d5fafc69ce7d261bb673e284e0d71f91480f"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906802 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.906828 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:35Z","lastTransitionTime":"2026-03-01T09:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.920477 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.937835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.949416 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.971955 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.981884 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:35 crc kubenswrapper[4792]: I0301 09:09:35.994863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009562 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.009600 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.011491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.034984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.054902 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.071895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.084196 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.097658 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.110313 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111969 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.111978 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.124618 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.140163 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.152575 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.167551 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.184401 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.199272 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.211373 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215663 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215672 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.215701 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.223948 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.235275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.247425 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.260518 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317621 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317668 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.317689 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420526 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.420549 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.522662 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.632801 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735740 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.735797 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.838988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.839004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.839015 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.902810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.907730 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.909209 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c"} Mar 01 09:09:36 crc 
kubenswrapper[4792]: I0301 09:09:36.919618 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.933132 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941414 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.941455 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:36Z","lastTransitionTime":"2026-03-01T09:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.949009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.961766 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.973773 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.984814 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:36 crc kubenswrapper[4792]: I0301 09:09:36.996249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:36Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.009074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.028519 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043404 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.043430 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.046177 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.060180 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.074708 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.087738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.098944 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.113182 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.126820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.142937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145461 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145470 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.145505 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.154118 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.165857 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.178441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.193578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.204040 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.214885 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.232383 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247514 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.247550 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349982 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.349992 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.407985 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.408022 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.407984 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408109 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408175 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.408247 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452617 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452625 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452638 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.452647 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554971 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.554987 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560449 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560433875 +0000 UTC m=+110.802313072 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560507 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560523 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.560528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560534 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560568 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560558349 +0000 UTC m=+110.802437546 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560595 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560606 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560614 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560639 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.56063119 +0000 UTC m=+110.802510377 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560666 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560685 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560680052 +0000 UTC m=+110.802559249 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560677 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: E0301 09:09:37.560800 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:41.560770564 +0000 UTC m=+110.802649791 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661574 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661586 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.661611 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.763646 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866174 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866211 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866239 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.866250 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.899489 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4gj45"] Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.900070 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.901867 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.902134 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.903441 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.907799 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.914719 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898" exitCode=0 Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.914806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.924724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.924760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" 
event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.931262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.953614 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:37 crc 
kubenswrapper[4792]: I0301 09:09:37.969893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.969919 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:37Z","lastTransitionTime":"2026-03-01T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.973358 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:37 crc kubenswrapper[4792]: I0301 09:09:37.991743 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.005009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.016618 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.031251 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.044031 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.054507 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.066297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.069743 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072122 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.072131 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.083942 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.097372 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.115861 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.127751 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.138137 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.155072 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.166936 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5923c286-5572-46c3-bed5-79cd67efc945-host\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.167633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5923c286-5572-46c3-bed5-79cd67efc945-serviceca\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.169431 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173718 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173775 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.173789 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.182940 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.187703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntx44\" (UniqueName: \"kubernetes.io/projected/5923c286-5572-46c3-bed5-79cd67efc945-kube-api-access-ntx44\") pod \"node-ca-4gj45\" (UID: \"5923c286-5572-46c3-bed5-79cd67efc945\") " pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.197037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.209365 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.220757 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.228707 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4gj45" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.232437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.243711 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.260205 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.274569 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276462 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.276487 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.284016 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 
09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378756 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.378765 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.484992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.485072 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588247 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.588262 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.691998 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.794825 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795493 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.795744 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.898663 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:38Z","lastTransitionTime":"2026-03-01T09:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.928331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gj45" event={"ID":"5923c286-5572-46c3-bed5-79cd67efc945","Type":"ContainerStarted","Data":"c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.928372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gj45" event={"ID":"5923c286-5572-46c3-bed5-79cd67efc945","Type":"ContainerStarted","Data":"b0a9dab35e0ef28411206f88fa21e6ed13237f334bc482e178c18d5655cc7d3a"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.930711 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43" exitCode=0 Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.930872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43"} Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.951542 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.970810 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:38 crc kubenswrapper[4792]: I0301 09:09:38.985176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:38Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001297 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 
09:09:39.001338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.001376 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.004848 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.017542 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.033410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.046589 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.058280 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066222 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.066254 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.072670 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z 
is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.080875 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084314 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.084337 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.097835 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.098279 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101084 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.101117 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.111176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.112738 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119490 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.119518 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.126435 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.131954 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135769 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135803 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135831 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.135872 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.143654 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.153047 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.153184 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154595 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154614 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.154645 4792 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.159686 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.179124 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.198828 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.216917 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.229814 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.245410 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 
09:09:39.256666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.256734 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.259164 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.272761 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.284510 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.293982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.304291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.315234 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.336757 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359652 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.359696 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408719 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.408791 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.408850 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.408942 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:39 crc kubenswrapper[4792]: E0301 09:09:39.409102 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.462584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.565873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566079 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.566122 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.668945 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770597 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.770733 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.873885 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.937657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.940034 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087" exitCode=0 Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.940076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.956154 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.966854 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979507 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979527 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.979540 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:39Z","lastTransitionTime":"2026-03-01T09:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.982640 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:39 crc kubenswrapper[4792]: I0301 09:09:39.995381 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:39Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.004673 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.016397 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.033432 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d
44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.074575 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc 
kubenswrapper[4792]: I0301 09:09:40.085426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.085436 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.113042 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.135594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.150235 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.162396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.175558 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188130 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.188141 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293050 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293095 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293126 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.293138 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399685 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399701 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.399736 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502976 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.502997 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605658 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.605945 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708539 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.708560 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811552 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.811663 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.914325 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:40Z","lastTransitionTime":"2026-03-01T09:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.946333 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301" exitCode=0 Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.946595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301"} Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.959633 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.971138 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.981376 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:40 crc kubenswrapper[4792]: I0301 09:09:40.999517 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:40Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017099 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017346 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.017760 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.017971 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.021275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d
44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.032722 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.049867 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.068159 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.082750 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.096408 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.116947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.119398 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.137732 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326a
daa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.151495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.222153 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324791 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.324861 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.408406 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408783 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.408824 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.420425 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427178 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427242 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.427252 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.430771 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.449897 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.464057 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.477581 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.492933 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.509655 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.523790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529009 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529039 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529073 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.529085 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.539961 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z 
is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.556841 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.570128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.585302 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.599151 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604658 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604639039 +0000 UTC m=+118.846518236 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604753 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604765 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604776 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604787 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604806 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604820 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604830 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604809 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604801674 +0000 UTC m=+118.846680871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604874 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:09:49.604861165 +0000 UTC m=+118.846740362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.604893 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.604884916 +0000 UTC m=+118.846764113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.604737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.605043 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: E0301 09:09:41.605241 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.605210244 +0000 UTC m=+118.847089441 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.631550 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734625 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.734675 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837116 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.837126 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.939831 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:41Z","lastTransitionTime":"2026-03-01T09:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.952399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.952985 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.956355 4792 generic.go:334] "Generic (PLEG): container finished" podID="131582d9-bd96-444b-a597-ceb81e2b2085" containerID="9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f" exitCode=0 Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.956383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerDied","Data":"9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f"} Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.967524 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.979342 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.989610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:41 crc kubenswrapper[4792]: I0301 09:09:41.994167 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:41Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.007755 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.020560 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.034874 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042075 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042101 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042109 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.042129 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.051362 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.063561 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.079740 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.093281 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.105788 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.116866 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.130478 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144772 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144800 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144826 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.144729 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.154452 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.163780 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.178762 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.198145 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabo
uts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.210491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.228495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.240473 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc 
kubenswrapper[4792]: I0301 09:09:42.249142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.249153 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.260133 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.272730 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:3
4Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.286408 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.304156 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.320140 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351554 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.351563 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.453873 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555878 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.555954 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.658424 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760758 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.760787 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862859 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.862934 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.962698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" event={"ID":"131582d9-bd96-444b-a597-ceb81e2b2085","Type":"ContainerStarted","Data":"67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.963620 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.963665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964363 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964373 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.964411 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:42Z","lastTransitionTime":"2026-03-01T09:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.978661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.984034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:42 crc kubenswrapper[4792]: I0301 09:09:42.992062 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-01T09:09:42Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.004981 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.018244 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.034434 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.048523 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.063255 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc 
kubenswrapper[4792]: I0301 09:09:43.066771 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.066873 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.078274 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258
aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.090658 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.103651 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.126588 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.141109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.152805 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.167809 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.168523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc 
kubenswrapper[4792]: I0301 09:09:43.168556 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.181501 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.192926 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.204556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.214308 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.226550 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053
c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.240441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e
5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.254125 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.265268 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270065 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270086 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270107 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.270115 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.275073 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.286437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.301114 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.318800 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:43Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372432 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372441 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.372462 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.410980 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.412587 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.412935 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.412987 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.413021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:43 crc kubenswrapper[4792]: E0301 09:09:43.413058 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.475555 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578236 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578290 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.578317 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.680863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.681014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.681138 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.783828 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885963 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.885989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.886009 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989141 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:43 crc kubenswrapper[4792]: I0301 09:09:43.989255 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:43Z","lastTransitionTime":"2026-03-01T09:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091059 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091086 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.091117 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.193503 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296610 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.296629 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399250 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.399283 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501575 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.501590 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603929 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603967 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.603998 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.705998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706064 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.706074 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808739 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808810 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.808857 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.911462 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:44Z","lastTransitionTime":"2026-03-01T09:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.970037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.973463 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" exitCode=1 Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.973520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23"} Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.974830 4792 scope.go:117] "RemoveContainer" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" Mar 01 09:09:44 crc kubenswrapper[4792]: I0301 09:09:44.989484 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:44Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.000806 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:44Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.013743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc 
kubenswrapper[4792]: I0301 09:09:45.014027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.014037 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.024076 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping 
reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.064199 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.089512 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.114240 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116178 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116223 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.116249 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.134074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.152960 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.162762 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.176117 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.188060 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.200823 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.216239 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218841 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218867 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218876 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.218900 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321259 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.321269 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.409628 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.409718 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.410041 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.410095 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.410133 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:45 crc kubenswrapper[4792]: E0301 09:09:45.410170 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.437808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.437943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.438240 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.541774 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.542658 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646451 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.646532 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.748566 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850375 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.850447 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952793 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952826 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.952836 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:45Z","lastTransitionTime":"2026-03-01T09:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.977862 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.980332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25"} Mar 01 09:09:45 crc kubenswrapper[4792]: I0301 09:09:45.981203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.001405 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.013798 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.024492 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.037213 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.049088 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054879 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054947 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.054962 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.064305 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.077113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.091516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.100970 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.112997 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.126834 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.135309 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.147060 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 
09:09:46.156675 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.156685 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260278 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.260370 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.362978 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.363072 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465491 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.465510 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567398 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.567420 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669469 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669512 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.669532 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772047 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.772218 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874048 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874077 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874085 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.874106 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.955536 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn"] Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.956130 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.957510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.957784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958283 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.958838 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.970816 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976052 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976087 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.976124 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:46Z","lastTransitionTime":"2026-03-01T09:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.985064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.985500 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/0.log" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.986839 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" 
event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25"} Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988264 4792 scope.go:117] "RemoveContainer" containerID="26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23" Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.988423 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" exitCode=1 Mar 01 09:09:46 crc kubenswrapper[4792]: I0301 09:09:46.989957 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:46 crc kubenswrapper[4792]: E0301 09:09:46.990202 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.001865 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.013030 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.024012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.033443 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.045176 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.058031 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059403 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.059557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.060693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.060949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ddb0171-7126-45ef-aea2-8433f52357a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.066383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ddb0171-7126-45ef-aea2-8433f52357a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.068110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.075841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx98x\" (UniqueName: \"kubernetes.io/projected/9ddb0171-7126-45ef-aea2-8433f52357a6-kube-api-access-rx98x\") pod \"ovnkube-control-plane-749d76644c-k7dvn\" (UID: \"9ddb0171-7126-45ef-aea2-8433f52357a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.077948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.078024 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.081978 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.098297 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.109249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.120249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.130516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.142571 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.156430 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.170417 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180754 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180821 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.180866 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.186557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.197113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.208952 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.222959 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.235674 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.254653 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.264810 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.268739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.279602 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a
37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-0
1T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: W0301 09:09:47.282794 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ddb0171_7126_45ef_aea2_8433f52357a6.slice/crio-2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3 WatchSource:0}: Error finding container 2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3: Status 404 returned error can't find the container with id 2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3 Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.282983 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283023 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.283031 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.303122 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26263abc51f572d7ba71366a0e5f432981bad7e073a91d6fc6bfc0ffa2ca9f23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:44Z\\\",\\\"message\\\":\\\"nPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0301 09:09:44.646724 6493 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646811 6493 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.646980 6493 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0301 09:09:44.646998 6493 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647121 6493 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0301 09:09:44.647665 6493 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0301 09:09:44.647688 6493 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0301 09:09:44.647713 6493 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0301 09:09:44.647733 6493 factory.go:656] Stopping watch factory\\\\nI0301 09:09:44.647748 6493 ovnkube.go:599] Stopped ovnkube\\\\nI0301 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{
\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mo
untPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 
09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.318162 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.330668 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386805 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386921 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386953 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.386987 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408348 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408440 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408525 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.408402 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.408646 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489847 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.489892 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592730 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592751 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.592764 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.696190 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.798533 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902181 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902217 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.902246 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:47Z","lastTransitionTime":"2026-03-01T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.992510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"2b48f991cbaafe8d2f382b896d1059580c6167e8c8b319e3db1398bab76a7ad3"} Mar 01 09:09:47 crc kubenswrapper[4792]: I0301 09:09:47.993383 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:47 crc kubenswrapper[4792]: E0301 09:09:47.993589 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.005227 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.007937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.021862 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.036741 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.048318 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.058214 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.067752 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.075956 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.085939 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.097689 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107433 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.107444 4792 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.108122 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.119761 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.132277 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.148879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.165634 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210203 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210225 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.210238 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312423 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312475 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.312484 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.414995 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.415177 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518535 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.518584 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625921 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.625967 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.728851 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.798463 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"] Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.798988 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.799062 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.811913 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.825161 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.832160 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.834947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.846370 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.859022 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.869131 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.874855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.875205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.882596 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.894528 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.913311 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.925722 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936215 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.936226 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:48Z","lastTransitionTime":"2026-03-01T09:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.941013 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.954926 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.967364 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.975817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.975917 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.976036 4792 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:48 crc kubenswrapper[4792]: E0301 09:09:48.976114 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:49.476097057 +0000 UTC m=+118.717976254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.985068 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:48 crc kubenswrapper[4792]: I0301 09:09:48.995212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw62d\" (UniqueName: \"kubernetes.io/projected/fa0bf523-6582-46b4-9134-28880a50b474-kube-api-access-gw62d\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.000115 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"netw
ork-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:48Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.003804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.003848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" event={"ID":"9ddb0171-7126-45ef-aea2-8433f52357a6","Type":"ContainerStarted","Data":"0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.005797 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039489 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.039557 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.141416 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243731 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.243835 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346194 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.346236 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408662 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408730 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.408826 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.408820 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.408967 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.409038 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.436823 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.448970 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452379 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.452416 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.464509 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.467840 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479345 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.479493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479672 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.479742 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:50.479724106 +0000 UTC m=+119.721603303 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484481 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.484504 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.500881 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.504410 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.516620 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:49Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.516736 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518303 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.518359 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.620567 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681341 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681314255 +0000 UTC m=+134.923193462 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681435 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.681493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:49 crc 
kubenswrapper[4792]: E0301 09:09:49.681527 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681517891 +0000 UTC m=+134.923397088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681595 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681638 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681623323 +0000 UTC m=+134.923502610 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681759 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681814 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681833 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681894 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681871019 +0000 UTC m=+134.923750216 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681766 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681944 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681955 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: E0301 09:09:49.681990 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:05.681980762 +0000 UTC m=+134.923860059 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723135 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.723195 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.825976 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.826065 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928478 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:49 crc kubenswrapper[4792]: I0301 09:09:49.928488 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:49Z","lastTransitionTime":"2026-03-01T09:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.030943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031018 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.031107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.034347 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.052172 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.071383 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.089378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e18
0d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:
09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.101092 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.114920 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.126402 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133301 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133365 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.133376 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.138712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.152134 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.166875 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.191713 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.205492 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.218887 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.230938 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 
09:09:50.235579 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.235612 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.242879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:50Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338832 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338895 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.338952 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.408537 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.408689 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.441691 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.489064 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.489204 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:50 crc kubenswrapper[4792]: E0301 09:09:50.489264 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:52.489248502 +0000 UTC m=+121.731127709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.543835 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647447 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.647457 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750353 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750425 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750478 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.750501 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853166 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853192 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.853203 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955874 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955883 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955898 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:50 crc kubenswrapper[4792]: I0301 09:09:50.955933 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:50Z","lastTransitionTime":"2026-03-01T09:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058258 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058326 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.058365 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160677 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.160706 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.263601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.263873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264092 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.264158 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:51Z","lastTransitionTime":"2026-03-01T09:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.364607 4792 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.407725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.407758 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.407934 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.408400 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.408505 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.408600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.422400 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.432734 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.440986 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.453577 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.466090 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.478072 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.491712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.503741 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.525956 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.539474 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.553059 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.565139 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.576329 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc 
kubenswrapper[4792]: I0301 09:09:51.592007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: I0301 09:09:51.604238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:51Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:51 crc kubenswrapper[4792]: E0301 09:09:51.753989 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:09:52 crc kubenswrapper[4792]: I0301 09:09:52.407924 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.408065 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:52 crc kubenswrapper[4792]: I0301 09:09:52.507744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.507861 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:52 crc kubenswrapper[4792]: E0301 09:09:52.507929 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:09:56.507894243 +0000 UTC m=+125.749773440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.380812 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408332 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408469 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:53 crc kubenswrapper[4792]: I0301 09:09:53.408579 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408701 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:53 crc kubenswrapper[4792]: E0301 09:09:53.408756 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:54 crc kubenswrapper[4792]: I0301 09:09:54.407768 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:54 crc kubenswrapper[4792]: E0301 09:09:54.408027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407942 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.407889 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408011 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408241 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408307 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:55 crc kubenswrapper[4792]: I0301 09:09:55.408687 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:09:55 crc kubenswrapper[4792]: E0301 09:09:55.408877 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:09:56 crc kubenswrapper[4792]: I0301 09:09:56.408167 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.408333 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:56 crc kubenswrapper[4792]: I0301 09:09:56.548750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.549236 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.549396 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:04.54936556 +0000 UTC m=+133.791244777 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:09:56 crc kubenswrapper[4792]: E0301 09:09:56.754929 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408195 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408195 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:57 crc kubenswrapper[4792]: I0301 09:09:57.408227 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.408698 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.408611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:57 crc kubenswrapper[4792]: E0301 09:09:57.409161 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:58 crc kubenswrapper[4792]: I0301 09:09:58.407693 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:09:58 crc kubenswrapper[4792]: E0301 09:09:58.407819 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408217 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408271 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.408368 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.408472 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.409419 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.409584 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.410010 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733675 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733781 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.733795 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.746893 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750468 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.750530 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.762757 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.765960 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766015 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.766036 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.776564 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780025 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780057 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780067 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780083 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.780096 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.810285 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:09:59 crc kubenswrapper[4792]: I0301 09:09:59.814494 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:09:59Z","lastTransitionTime":"2026-03-01T09:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.828186 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:59Z is after 2025-08-24T17:21:41Z" Mar 01 09:09:59 crc kubenswrapper[4792]: E0301 09:09:59.828301 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.046509 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.056649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456"} Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.057131 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.074262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.089454 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.101282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.111540 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.126556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.140389 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.150502 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.159531 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.178346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service 
openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.189733 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.201563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.213966 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.225566 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc 
kubenswrapper[4792]: I0301 09:10:00.238637 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.249362 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:00Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:00 crc kubenswrapper[4792]: I0301 09:10:00.408377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:00 crc kubenswrapper[4792]: E0301 09:10:00.408510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.062157 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.063192 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/1.log" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066417 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" exitCode=1 Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066473 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456"} Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.066540 4792 scope.go:117] "RemoveContainer" containerID="52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.067529 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.067816 
4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.083436 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.096733 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.106404 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc 
kubenswrapper[4792]: I0301 09:10:01.120901 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.132817 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.143984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.153553 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.162128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.174378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.187490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.197787 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.209728 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.220183 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.238170 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.248490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408814 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.408875 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409012 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409191 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.409318 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.424580 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.438638 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.453165 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.462442 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc 
kubenswrapper[4792]: I0301 09:10:01.473574 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.486161 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.497352 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.511061 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.524899 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.538735 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.553162 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.564665 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.576897 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.597005 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52f06d1a6aa3f9871d55b3db6eb32ffacdf68fded2a68958577c63d060e19d25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"message\\\":\\\"rage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:09:46.070022 6641 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:09:46Z is after 2025-08-24T17:21:41Z]\\\\nI0301 09:09:46.070084 6641 services_controller.go:451] Built service openshif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: I0301 09:10:01.606341 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:01Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:01 crc kubenswrapper[4792]: E0301 09:10:01.756743 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.071310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.076643 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:02 crc kubenswrapper[4792]: E0301 09:10:02.076953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.091512 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.111188 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.124415 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.143786 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.166260 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.177386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.190852 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.203583 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.222339 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.238029 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.254568 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.268943 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.283933 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc 
kubenswrapper[4792]: I0301 09:10:02.299964 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.314301 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:02Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:02 crc kubenswrapper[4792]: I0301 09:10:02.407850 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:02 crc kubenswrapper[4792]: E0301 09:10:02.408015 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408073 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408181 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:03 crc kubenswrapper[4792]: I0301 09:10:03.408072 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408336 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408410 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:03 crc kubenswrapper[4792]: E0301 09:10:03.408623 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:04 crc kubenswrapper[4792]: I0301 09:10:04.408397 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.409193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:04 crc kubenswrapper[4792]: I0301 09:10:04.631461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.631717 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:04 crc kubenswrapper[4792]: E0301 09:10:04.631875 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:20.631845054 +0000 UTC m=+149.873724291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.408600 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.408600 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.408765 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.408832 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.409646 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.409994 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.418862 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.741996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.742032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:05 crc 
kubenswrapper[4792]: E0301 09:10:05.742056 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742028416 +0000 UTC m=+166.983907633 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:10:05 crc kubenswrapper[4792]: I0301 09:10:05.742091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742152 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742155 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742214 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 
09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742217 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742225 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742235 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.74221555 +0000 UTC m=+166.984094747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742240 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742170 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742274 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742258931 +0000 UTC m=+166.984138238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742274 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742295 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742284172 +0000 UTC m=+166.984163509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:05 crc kubenswrapper[4792]: E0301 09:10:05.742317 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:37.742307292 +0000 UTC m=+166.984186619 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:06 crc kubenswrapper[4792]: I0301 09:10:06.408431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:06 crc kubenswrapper[4792]: E0301 09:10:06.408889 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:06 crc kubenswrapper[4792]: E0301 09:10:06.758958 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408503 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.408723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.408854 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.408965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.409480 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.409588 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:07 crc kubenswrapper[4792]: E0301 09:10:07.409868 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:07 crc kubenswrapper[4792]: I0301 09:10:07.423622 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 01 09:10:08 crc kubenswrapper[4792]: I0301 09:10:08.408447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:08 crc kubenswrapper[4792]: E0301 09:10:08.408773 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409289 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408712 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409531 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.408660 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:09 crc kubenswrapper[4792]: E0301 09:10:09.409784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.997767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998098 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:09 crc kubenswrapper[4792]: I0301 09:10:09.998625 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:09Z","lastTransitionTime":"2026-03-01T09:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.011645 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.016891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.017082 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.033337 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037071 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037110 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037119 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.037146 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.047943 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053409 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.053599 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.067765 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.071932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072078 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072161 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.072355 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:10Z","lastTransitionTime":"2026-03-01T09:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.086580 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:10Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.086996 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:10 crc kubenswrapper[4792]: I0301 09:10:10.407999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:10 crc kubenswrapper[4792]: E0301 09:10:10.408430 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408562 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408642 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.408571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.408797 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.408960 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.409134 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.428074 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.445873 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.464673 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.478019 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.488830 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.502685 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.523127 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b6
3be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.536532 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.547896 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.569406 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.581753 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.597820 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.613131 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.625353 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.636437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc 
kubenswrapper[4792]: I0301 09:10:11.651090 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: I0301 09:10:11.666386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:11Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:11 crc kubenswrapper[4792]: E0301 09:10:11.759970 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:12 crc kubenswrapper[4792]: I0301 09:10:12.407760 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:12 crc kubenswrapper[4792]: E0301 09:10:12.408016 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.407769 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.407890 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.407963 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.408027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:13 crc kubenswrapper[4792]: I0301 09:10:13.408075 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:13 crc kubenswrapper[4792]: E0301 09:10:13.408144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:14 crc kubenswrapper[4792]: I0301 09:10:14.408292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:14 crc kubenswrapper[4792]: E0301 09:10:14.408534 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:14 crc kubenswrapper[4792]: I0301 09:10:14.409369 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:14 crc kubenswrapper[4792]: E0301 09:10:14.409567 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408430 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408489 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:15 crc kubenswrapper[4792]: I0301 09:10:15.408334 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:15 crc kubenswrapper[4792]: E0301 09:10:15.408549 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:16 crc kubenswrapper[4792]: I0301 09:10:16.407804 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:16 crc kubenswrapper[4792]: E0301 09:10:16.407950 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:16 crc kubenswrapper[4792]: E0301 09:10:16.761384 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408762 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408806 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:17 crc kubenswrapper[4792]: I0301 09:10:17.408891 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409099 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409234 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:17 crc kubenswrapper[4792]: E0301 09:10:17.409363 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:18 crc kubenswrapper[4792]: I0301 09:10:18.408534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:18 crc kubenswrapper[4792]: E0301 09:10:18.409345 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408031 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:19 crc kubenswrapper[4792]: I0301 09:10:19.408187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409402 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409481 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:19 crc kubenswrapper[4792]: E0301 09:10:19.409530 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133664 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133717 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.133727 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.145808 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149740 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.149817 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.161563 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.164817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.165209 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.176244 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.178991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.179058 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.194502 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.197981 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:20Z","lastTransitionTime":"2026-03-01T09:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.211046 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:20Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.211215 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.408397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.408587 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:20 crc kubenswrapper[4792]: I0301 09:10:20.697350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.697666 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:20 crc kubenswrapper[4792]: E0301 09:10:20.698648 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:10:52.697783158 +0000 UTC m=+181.939662405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408852 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.408859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409020 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409155 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.409207 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.421577 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.433144 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.455863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.469346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.482109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.496813 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.508266 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.524189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc 
kubenswrapper[4792]: I0301 09:10:21.536238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.548752 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.564042 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.576899 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69
520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.589937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.605166 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.616323 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.625386 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: I0301 09:10:21.636853 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2da
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:21Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:21 crc kubenswrapper[4792]: E0301 09:10:21.762607 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:22 crc kubenswrapper[4792]: I0301 09:10:22.407786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:22 crc kubenswrapper[4792]: E0301 09:10:22.408242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:22 crc kubenswrapper[4792]: I0301 09:10:22.408332 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:22 crc kubenswrapper[4792]: E0301 09:10:22.408481 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140053 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140126 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae" exitCode=1 Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae"} Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.140559 4792 scope.go:117] "RemoveContainer" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.160415 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.175291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.189850 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.200879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.210984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.225550 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.242478 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.252987 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.264793 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.276103 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.292189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.303256 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.316355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.329190 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.337942 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc 
kubenswrapper[4792]: I0301 09:10:23.349177 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.360749 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:23Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.407949 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.408110 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:23 crc kubenswrapper[4792]: I0301 09:10:23.407949 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408234 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:23 crc kubenswrapper[4792]: E0301 09:10:23.408277 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.145169 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.145225 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"} Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.158795 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.171544 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.189975 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.202406 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.214769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.227354 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.239945 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.250249 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc 
kubenswrapper[4792]: I0301 09:10:24.262196 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.277738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.290035 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.302036 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.312539 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.321325 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.331518 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.345983 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.356230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:24Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:24 crc kubenswrapper[4792]: I0301 09:10:24.408573 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:24 crc kubenswrapper[4792]: E0301 09:10:24.408716 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.407981 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408105 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.408293 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408354 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:25 crc kubenswrapper[4792]: I0301 09:10:25.408500 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:25 crc kubenswrapper[4792]: E0301 09:10:25.408559 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:26 crc kubenswrapper[4792]: I0301 09:10:26.408604 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:26 crc kubenswrapper[4792]: E0301 09:10:26.408751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:26 crc kubenswrapper[4792]: I0301 09:10:26.409474 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:26 crc kubenswrapper[4792]: E0301 09:10:26.764263 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.157241 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.161602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.162118 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.175872 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.190721 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.202880 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.232009 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.246004 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.264506 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.281230 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.296516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc 
kubenswrapper[4792]: I0301 09:10:27.314298 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.332652 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.346357 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.362763 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09
dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.377574 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.387538 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.397790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408390 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.408533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408755 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.408882 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.408963 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:27 crc kubenswrapper[4792]: E0301 09:10:27.409035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.411195 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:27 crc kubenswrapper[4792]: I0301 09:10:27.422979 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:27Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.167807 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.168726 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/2.log" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172679 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" exitCode=1 Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.172753 4792 scope.go:117] "RemoveContainer" containerID="e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.175709 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:28 crc kubenswrapper[4792]: E0301 09:10:28.176117 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 
09:10:28.190170 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.204835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.222860 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e630381834b4882d359f009f3dad3adfc5d3449fea186f26ba2794a777d2d456\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:00Z\\\",\\\"message\\\":\\\"uid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0301 09:10:00.308291 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built 
lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.233331 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.246046 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 
09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.259657 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.273378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.285978 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc 
kubenswrapper[4792]: I0301 09:10:28.298737 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.312520 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.327720 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.342360 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258
aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.353007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.365037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.376623 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.387833 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.396999 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:28Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:28 crc kubenswrapper[4792]: I0301 09:10:28.408360 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:28 crc kubenswrapper[4792]: E0301 09:10:28.408596 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.179695 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.185199 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.185382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.201047 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.219804 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.242799 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.257947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.270837 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.291590 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc
1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.313154 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.325364 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.342742 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.356280 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.374149 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.391982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b853
6cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.404546 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407741 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407779 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.407798 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.407934 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.408022 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:29 crc kubenswrapper[4792]: E0301 09:10:29.408151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.422644 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.437288 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc 
kubenswrapper[4792]: I0301 09:10:29.453617 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:29 crc kubenswrapper[4792]: I0301 09:10:29.467601 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:29Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.408474 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.408598 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502056 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502093 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.502148 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.514417 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.517933 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.518038 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.546739 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553396 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.553431 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.574097 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.580580 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.593665 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596604 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:30 crc kubenswrapper[4792]: I0301 09:10:30.596613 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:30Z","lastTransitionTime":"2026-03-01T09:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.607398 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"10ee72b7-c3f1-449a-bf55-34c8d2b9c7af\\\",\\\"systemUUID\\\":\\\"7013d830-7d29-4a03-853d-b832509642d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:30Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:30 crc kubenswrapper[4792]: E0301 09:10:30.607648 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.408130 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408550 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408377 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.408606 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.421389 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddb0171-7126-45ef-aea2-8433f52357a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0af07e226c7efd45ef475f2bb96192734bb29dbd1e16b83dda8a437e0a5fd606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c0585a4ca4551b3b8182c6427a8b2a5b8536cf388b7a9075819593b63502cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx98x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:46Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k7dvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.433665 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.446938 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9105f6b0-6f16-47aa-8009-73736a90b765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a724836f4779a4db102533ef8d07b1d8366a37e7227bb5d46ef9727303159af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c3
9ceff683cae1dcd5ee58dcac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bqszv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.464890 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:27Z\\\",\\\"message\\\":\\\"nityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.254\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0301 09:10:27.273332 7207 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:10:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d41ffd76f197ba47f7
502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqvlh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pp7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.481240 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de0393ad-517c-4b31-9bf4-ed1a3d855bc1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:09:32Z\\\",\\\"message\\\":\\\"le observer\\\\nW0301 09:09:31.862667 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0301 09:09:31.862769 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0301 09:09:31.863373 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3937395935/tls.crt::/tmp/serving-cert-3937395935/tls.key\\\\\\\"\\\\nI0301 09:09:32.235137 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0301 09:09:32.237581 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0301 09:09:32.237596 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0301 09:09:32.237616 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0301 09:09:32.237621 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0301 09:09:32.242984 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0301 09:09:32.243010 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0301 09:09:32.243015 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243023 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0301 09:09:32.243029 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0301 09:09:32.243033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0301 09:09:32.243037 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0301 09:09:32.243041 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0301 09:09:32.245663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d
8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.499313 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f47158063d5006ef2d65e87dccd694e8082e89000497a8e90299a776b804ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://352d504c65a751f9b5ff04570627fa604d3d7c79326adaa8ead313cb4f865d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.510983 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.521204 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-frm7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa0bf523-6582-46b4-9134-28880a50b474\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw62d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-frm7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc 
kubenswrapper[4792]: I0301 09:10:31.533256 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650a1f7b-2bf5-4a55-87af-ecee46abe3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8130bb3020d8a95b40f9675531603ec0382a75d197b88e9b13b7f3bf3c0ca5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e4a35ca4af33a2e93167010c2d2d02e6ef5fa5c4f65b6b87b344b58cc0a3867\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-01T09:08:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0301 09:08:18.955011 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0301 09:08:18.956745 1 observer_polling.go:159] Starting file observer\\\\nI0301 09:08:18.958204 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0301 09:08:18.959319 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0301 09:08:42.859426 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0301 09:08:48.505666 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0301 09:08:48.505786 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:08:18Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93e0bcca6e83da45b6dd64db0083da5cc4045d25d7a66615d09f8674cc02433\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a213f48ea7aba6dc2e3a9aa8d171acb6aa5c22532eed941bdc337e89b76589\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.546381 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f19b120bac2d8adff9d6c1584e5bb743cbd827287195090a2eaa57c507eb8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.558037 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zql8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0982a9bb-56d4-4e1c-86cb-76a4152de9ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb7ff70050b8bc173e206ad7c51482f8c78a94159d4dba497402c069ced283dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjzhv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zql8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.571672 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pq28p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://833126c3956c8927e1b68252bd9962e43df9c6e09
dc2b98a20208c2db19a5fc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-01T09:10:22Z\\\",\\\"message\\\":\\\"2026-03-01T09:09:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556\\\\n2026-03-01T09:09:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f5445e48-670b-4a9c-9944-fbd0231ee556 to /host/opt/cni/bin/\\\\n2026-03-01T09:09:37Z [verbose] multus-daemon started\\\\n2026-03-01T09:09:37Z [verbose] Readiness Indicator file check\\\\n2026-03-01T09:10:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:10:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxpbv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pq28p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.592138 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131582d9-bd96-444b-a597-ceb81e2b2085\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67e87701fd366bbcc1401440e5be6144733e60e6e5d64b63be5ac3507947ee62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f5dabedde4fbf061824718bcefd0b4c5d37897065ad0e180d3cd55e81bb0f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069e47ce2e6a86333e349380edd69bc9eaba61f0e1d34ba970cc3b6cba437898\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://bf822fb0d92281f211c6ff1de9e7ab692101d44ad155356b34c8396564c73f43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c258aeaf071d7c98fd43a936c39ecf7de25dcc3329d3e0f32b3ba6d5722b087\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce2731b298761193136bbbd9bf7fb8f6129e601c50902417ee8bc17aac311301\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b3d5ce77f7b8b7d59d966459c597f3569d3825f3a10cba30f3b95c506405a4f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvrws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rbwx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.603286 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gj45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5923c286-5572-46c3-bed5-79cd67efc945\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2bbfe28aa0bf13501298bd82fef334ef0d63f12b4c8727f08ed8dc34d687ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntx44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:09:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gj45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.614539 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7a80515-bc6a-4158-8fc7-835f324381b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-01T09:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://584d540504344f06dee541ba7491978c79d823f7bea306efe216719f32cfbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81dff9998b1dc46b4b8cf890b0a1cf9f6201067af0aa512dfff4ad0c3688d69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b89e60e36b6e5a848c0d033d1ba3cfb1c58af2a1288bfd5612ade34b0c24e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e046a8fc92c9bbbcba4c1a83c870045a2986787c2d49ec55bd3612f8a38d34f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-01T09:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-01T09:07:52Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-01T09:07:51Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.626632 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: I0301 09:10:31.638132 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-01T09:09:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c053a8e6f3422b9d56c476720e0af0dad7f9b6b98ce47ecf56a492f58ec2ba8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-01T09:09:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-01T09:10:31Z is after 2025-08-24T17:21:41Z" Mar 01 09:10:31 crc kubenswrapper[4792]: E0301 09:10:31.765143 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:32 crc kubenswrapper[4792]: I0301 09:10:32.408431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:32 crc kubenswrapper[4792]: E0301 09:10:32.408642 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.407768 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.407897 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.408400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.408528 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.408894 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.409392 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:33 crc kubenswrapper[4792]: I0301 09:10:33.409489 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:33 crc kubenswrapper[4792]: E0301 09:10:33.410381 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:34 crc kubenswrapper[4792]: I0301 09:10:34.408452 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:34 crc kubenswrapper[4792]: E0301 09:10:34.408585 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:34 crc kubenswrapper[4792]: I0301 09:10:34.418529 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409013 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409007 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:35 crc kubenswrapper[4792]: I0301 09:10:35.409117 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409331 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409417 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:35 crc kubenswrapper[4792]: E0301 09:10:35.409508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:36 crc kubenswrapper[4792]: I0301 09:10:36.408758 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:36 crc kubenswrapper[4792]: E0301 09:10:36.410031 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:36 crc kubenswrapper[4792]: E0301 09:10:36.766497 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.408603 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.408757 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.408930 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.409045 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769298 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769495 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769529 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769498696 +0000 UTC m=+231.011377933 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769567 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769612 4792 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769609598 +0000 UTC m=+231.011488895 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769739 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769758 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769772 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769787 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769805 4792 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769814 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769803912 +0000 UTC m=+231.011683209 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769818 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769844 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769822163 +0000 UTC m=+231.011701360 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 01 09:10:37 crc kubenswrapper[4792]: I0301 09:10:37.769697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:37 crc kubenswrapper[4792]: E0301 09:10:37.769867 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.769859103 +0000 UTC m=+231.011738300 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 01 09:10:38 crc kubenswrapper[4792]: I0301 09:10:38.408197 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:38 crc kubenswrapper[4792]: E0301 09:10:38.408339 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408752 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408826 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.408825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.408956 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.409074 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.409625 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:39 crc kubenswrapper[4792]: I0301 09:10:39.409949 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:39 crc kubenswrapper[4792]: E0301 09:10:39.410144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.408321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:40 crc kubenswrapper[4792]: E0301 09:10:40.408498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.653954 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-01T09:10:40Z","lastTransitionTime":"2026-03-01T09:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.707928 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk"] Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.708483 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711306 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.711524 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.754568 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=33.754537164 podStartE2EDuration="33.754537164s" podCreationTimestamp="2026-03-01 09:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.73469428 +0000 UTC m=+169.976573517" watchObservedRunningTime="2026-03-01 09:10:40.754537164 +0000 UTC m=+169.996416401" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.793341 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.794871 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zql8j" podStartSLOduration=109.794852286 podStartE2EDuration="1m49.794852286s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-01 09:10:40.793693252 +0000 UTC m=+170.035572449" watchObservedRunningTime="2026-03-01 09:10:40.794852286 +0000 UTC m=+170.036731483" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.801006 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804657 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.804677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.826573 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rbwx8" podStartSLOduration=108.826553169 podStartE2EDuration="1m48.826553169s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.826437076 +0000 UTC m=+170.068316263" watchObservedRunningTime="2026-03-01 09:10:40.826553169 +0000 UTC m=+170.068432386" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.827198 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pq28p" podStartSLOduration=108.827188872 podStartE2EDuration="1m48.827188872s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.808839669 +0000 UTC m=+170.050718886" watchObservedRunningTime="2026-03-01 09:10:40.827188872 +0000 UTC m=+170.069068079" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.840994 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4gj45" podStartSLOduration=109.84097686 
podStartE2EDuration="1m49.84097686s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.84049673 +0000 UTC m=+170.082375937" watchObservedRunningTime="2026-03-01 09:10:40.84097686 +0000 UTC m=+170.082856067" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.863885 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.863867858 podStartE2EDuration="35.863867858s" podCreationTimestamp="2026-03-01 09:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.85293689 +0000 UTC m=+170.094816107" watchObservedRunningTime="2026-03-01 09:10:40.863867858 +0000 UTC m=+170.105747065" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906188 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.906415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc 
kubenswrapper[4792]: I0301 09:10:40.907066 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.907717 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podStartSLOduration=108.907707094 podStartE2EDuration="1m48.907707094s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.879241609 +0000 UTC m=+170.121120816" watchObservedRunningTime="2026-03-01 09:10:40.907707094 +0000 UTC m=+170.149586291" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.914288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.930311 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k7dvn" podStartSLOduration=108.930294086 podStartE2EDuration="1m48.930294086s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.928789414 +0000 UTC m=+170.170668611" watchObservedRunningTime="2026-03-01 09:10:40.930294086 +0000 UTC 
m=+170.172173283" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.931829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4c63f6a-1c48-4f27-8687-b7be8c24fcb9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-k5kmk\" (UID: \"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:40 crc kubenswrapper[4792]: I0301 09:10:40.958296 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.95826853 podStartE2EDuration="6.95826853s" podCreationTimestamp="2026-03-01 09:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:40.947616098 +0000 UTC m=+170.189495295" watchObservedRunningTime="2026-03-01 09:10:40.95826853 +0000 UTC m=+170.200147727" Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.026652 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" Mar 01 09:10:41 crc kubenswrapper[4792]: W0301 09:10:41.044148 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c63f6a_1c48_4f27_8687_b7be8c24fcb9.slice/crio-f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4 WatchSource:0}: Error finding container f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4: Status 404 returned error can't find the container with id f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4 Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.224897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" event={"ID":"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9","Type":"ContainerStarted","Data":"5569e4527c708368f798d693e286f5fd93cc26ec2fc46b0ee1fda4c9120d9d05"} Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.224966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" event={"ID":"f4c63f6a-1c48-4f27-8687-b7be8c24fcb9","Type":"ContainerStarted","Data":"f41d2c318ff605f83c38e7b35e1f9361796bdba7a0e6aa5a26fd9330f35552a4"} Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.237776 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-k5kmk" podStartSLOduration=110.237759249 podStartE2EDuration="1m50.237759249s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:41.236968012 +0000 UTC m=+170.478847209" watchObservedRunningTime="2026-03-01 09:10:41.237759249 +0000 UTC m=+170.479638446" Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.408542 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.408684 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409299 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:41 crc kubenswrapper[4792]: I0301 09:10:41.409323 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409443 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.409654 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:41 crc kubenswrapper[4792]: E0301 09:10:41.768160 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:42 crc kubenswrapper[4792]: I0301 09:10:42.407756 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:42 crc kubenswrapper[4792]: E0301 09:10:42.407976 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408127 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408196 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:43 crc kubenswrapper[4792]: I0301 09:10:43.408145 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408281 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408461 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:43 crc kubenswrapper[4792]: E0301 09:10:43.408501 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:44 crc kubenswrapper[4792]: I0301 09:10:44.407925 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:44 crc kubenswrapper[4792]: E0301 09:10:44.408061 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.409595 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.409744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.410001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.410105 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:45 crc kubenswrapper[4792]: I0301 09:10:45.409554 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:45 crc kubenswrapper[4792]: E0301 09:10:45.412056 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:46 crc kubenswrapper[4792]: I0301 09:10:46.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.408566 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:46 crc kubenswrapper[4792]: I0301 09:10:46.409382 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.409776 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 01 09:10:46 crc kubenswrapper[4792]: E0301 09:10:46.770303 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408340 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408477 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:47 crc kubenswrapper[4792]: I0301 09:10:47.408399 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408552 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:47 crc kubenswrapper[4792]: E0301 09:10:47.408577 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:48 crc kubenswrapper[4792]: I0301 09:10:48.408220 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:48 crc kubenswrapper[4792]: E0301 09:10:48.408612 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:48 crc kubenswrapper[4792]: I0301 09:10:48.426677 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408131 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408143 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408427 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:49 crc kubenswrapper[4792]: I0301 09:10:49.408159 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:49 crc kubenswrapper[4792]: E0301 09:10:49.408506 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:50 crc kubenswrapper[4792]: I0301 09:10:50.408784 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:50 crc kubenswrapper[4792]: E0301 09:10:50.409069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.409938 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410037 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410334 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410297 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.410048 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410437 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410472 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.410508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:51 crc kubenswrapper[4792]: I0301 09:10:51.437106 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.437084381 podStartE2EDuration="3.437084381s" podCreationTimestamp="2026-03-01 09:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:51.435730982 +0000 UTC m=+180.677610199" watchObservedRunningTime="2026-03-01 09:10:51.437084381 +0000 UTC m=+180.678963698" Mar 01 09:10:51 crc kubenswrapper[4792]: E0301 09:10:51.771317 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 01 09:10:52 crc kubenswrapper[4792]: I0301 09:10:52.408489 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.408681 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:52 crc kubenswrapper[4792]: I0301 09:10:52.713510 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.714512 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:52 crc kubenswrapper[4792]: E0301 09:10:52.714589 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs podName:fa0bf523-6582-46b4-9134-28880a50b474 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:56.714569037 +0000 UTC m=+245.956448244 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs") pod "network-metrics-daemon-frm7z" (UID: "fa0bf523-6582-46b4-9134-28880a50b474") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407711 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.407844 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:53 crc kubenswrapper[4792]: I0301 09:10:53.407726 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.407993 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:53 crc kubenswrapper[4792]: E0301 09:10:53.408069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:54 crc kubenswrapper[4792]: I0301 09:10:54.408606 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:54 crc kubenswrapper[4792]: E0301 09:10:54.408734 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.408810 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.408843 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409064 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:55 crc kubenswrapper[4792]: I0301 09:10:55.409219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:55 crc kubenswrapper[4792]: E0301 09:10:55.409867 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:56 crc kubenswrapper[4792]: I0301 09:10:56.407890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:56 crc kubenswrapper[4792]: E0301 09:10:56.408325 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:56 crc kubenswrapper[4792]: E0301 09:10:56.773167 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.408796 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.408994 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.409375 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409489 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:57 crc kubenswrapper[4792]: E0301 09:10:57.409641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:10:57 crc kubenswrapper[4792]: I0301 09:10:57.409997 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.278595 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.281639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40"} Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.283098 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:10:58 crc kubenswrapper[4792]: I0301 09:10:58.408047 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:10:58 crc kubenswrapper[4792]: E0301 09:10:58.408184 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408124 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:10:59 crc kubenswrapper[4792]: I0301 09:10:59.408265 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408325 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408257 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:10:59 crc kubenswrapper[4792]: E0301 09:10:59.408441 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:00 crc kubenswrapper[4792]: I0301 09:11:00.408673 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:00 crc kubenswrapper[4792]: E0301 09:11:00.408784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.408283 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.408394 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:01 crc kubenswrapper[4792]: I0301 09:11:01.409812 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.409799 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.410042 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.410110 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:01 crc kubenswrapper[4792]: E0301 09:11:01.774581 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:11:02 crc kubenswrapper[4792]: I0301 09:11:02.407854 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:02 crc kubenswrapper[4792]: E0301 09:11:02.408083 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.408705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.408850 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409044 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.409147 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409213 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.409611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:03 crc kubenswrapper[4792]: I0301 09:11:03.409951 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:11:03 crc kubenswrapper[4792]: E0301 09:11:03.410204 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pp7m_openshift-ovn-kubernetes(e2bd7bac-21cf-4657-ab84-68a14f99f8f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0"
Mar 01 09:11:04 crc kubenswrapper[4792]: I0301 09:11:04.408710 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:04 crc kubenswrapper[4792]: E0301 09:11:04.409133 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.408807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.409262 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:05 crc kubenswrapper[4792]: I0301 09:11:05.408384 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:05 crc kubenswrapper[4792]: E0301 09:11:05.410305 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:06 crc kubenswrapper[4792]: I0301 09:11:06.408605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:06 crc kubenswrapper[4792]: E0301 09:11:06.408778 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:06 crc kubenswrapper[4792]: E0301 09:11:06.776193 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.407954 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.407996 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408128 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:07 crc kubenswrapper[4792]: I0301 09:11:07.408269 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408416 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:07 crc kubenswrapper[4792]: E0301 09:11:07.408507 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:08 crc kubenswrapper[4792]: I0301 09:11:08.408275 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:08 crc kubenswrapper[4792]: E0301 09:11:08.408588 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.321063 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322174 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/0.log"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322308 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" exitCode=1
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"}
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322405 4792 scope.go:117] "RemoveContainer" containerID="239da7b9dbba3d059f0f694f15db1b15a21917053c2ed3ac36c4be78c84a2dae"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.322819 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"
Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.323061 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.341782 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.341757252 podStartE2EDuration="1m36.341757252s" podCreationTimestamp="2026-03-01 09:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:10:58.316163183 +0000 UTC m=+187.558042450" watchObservedRunningTime="2026-03-01 09:11:09.341757252 +0000 UTC m=+198.583636449"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408465 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408491 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:09 crc kubenswrapper[4792]: I0301 09:11:09.408519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408682 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:09 crc kubenswrapper[4792]: E0301 09:11:09.408752 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:10 crc kubenswrapper[4792]: I0301 09:11:10.326293 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log"
Mar 01 09:11:10 crc kubenswrapper[4792]: I0301 09:11:10.407739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:10 crc kubenswrapper[4792]: E0301 09:11:10.407876 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408137 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408343 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.408333 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:11 crc kubenswrapper[4792]: I0301 09:11:11.408408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.408511 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.409498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:11 crc kubenswrapper[4792]: E0301 09:11:11.776963 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:11:12 crc kubenswrapper[4792]: I0301 09:11:12.208370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 01 09:11:12 crc kubenswrapper[4792]: I0301 09:11:12.408207 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:12 crc kubenswrapper[4792]: E0301 09:11:12.408328 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.407832 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.407992 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.408060 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:13 crc kubenswrapper[4792]: I0301 09:11:13.408080 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.408219 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:13 crc kubenswrapper[4792]: E0301 09:11:13.408297 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:14 crc kubenswrapper[4792]: I0301 09:11:14.407995 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:14 crc kubenswrapper[4792]: E0301 09:11:14.408143 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.408754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.408947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.409214 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.409300 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:15 crc kubenswrapper[4792]: I0301 09:11:15.409566 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:15 crc kubenswrapper[4792]: E0301 09:11:15.409655 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:16 crc kubenswrapper[4792]: I0301 09:11:16.408691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:16 crc kubenswrapper[4792]: E0301 09:11:16.408806 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:16 crc kubenswrapper[4792]: E0301 09:11:16.778183 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408085 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408129 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408208 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:17 crc kubenswrapper[4792]: I0301 09:11:17.408220 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:17 crc kubenswrapper[4792]: E0301 09:11:17.408626 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:18 crc kubenswrapper[4792]: I0301 09:11:18.408061 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:18 crc kubenswrapper[4792]: E0301 09:11:18.408860 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:18 crc kubenswrapper[4792]: I0301 09:11:18.409400 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.357878 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.360075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerStarted","Data":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"}
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.360476 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.383316 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podStartSLOduration=147.383301273 podStartE2EDuration="2m27.383301273s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:19.382583375 +0000 UTC m=+208.624462572" watchObservedRunningTime="2026-03-01 09:11:19.383301273 +0000 UTC m=+208.625180470"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.410768 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.410887 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.411086 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.411144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.411774 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.411849 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.520169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"]
Mar 01 09:11:19 crc kubenswrapper[4792]: I0301 09:11:19.520294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:19 crc kubenswrapper[4792]: E0301 09:11:19.520382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.407934 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.407939 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.408006 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:21 crc kubenswrapper[4792]: I0301 09:11:21.408023 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.408952 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409040 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409134 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.409220 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 01 09:11:21 crc kubenswrapper[4792]: E0301 09:11:21.779287 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.408747 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z"
Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.408934 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409075 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474"
Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.409094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409239 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 01 09:11:23 crc kubenswrapper[4792]: I0301 09:11:23.409375 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1"
Mar 01 09:11:23 crc kubenswrapper[4792]: E0301 09:11:23.409438 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:24 crc kubenswrapper[4792]: I0301 09:11:24.378409 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:11:24 crc kubenswrapper[4792]: I0301 09:11:24.378506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"} Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.407970 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408002 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408086 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408235 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408384 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-frm7z" podUID="fa0bf523-6582-46b4-9134-28880a50b474" Mar 01 09:11:25 crc kubenswrapper[4792]: I0301 09:11:25.408409 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408449 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 01 09:11:25 crc kubenswrapper[4792]: E0301 09:11:25.408881 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408250 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.408330 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414264 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414536 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414683 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.414766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.415176 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:11:27 crc kubenswrapper[4792]: I0301 09:11:27.415185 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.618074 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.658599 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.659219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669132 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669754 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.669753 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670887 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.670773 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.671241 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.671957 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.674214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.674874 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.675349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.675881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.677164 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.682362 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.685007 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.685687 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686029 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686177 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.686586 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.694572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.706553 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.707795 4792 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.708015 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719111 4792 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: 
E0301 09:11:31.719174 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719467 4792 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.719487 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.719615 4792 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.719627 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace 
\"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.720043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.721991 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.722408 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732734 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732846 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723604 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.733344 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.733573 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723238 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723661 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723710 4792 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733719 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723810 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723834 4792 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733820 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in 
API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723850 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723858 4792 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733932 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723892 4792 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.733953 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723896 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723900 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: W0301 09:11:31.723980 4792 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 01 09:11:31 crc kubenswrapper[4792]: E0301 09:11:31.734123 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.723985 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724016 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724067 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724067 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724071 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724111 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724116 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724184 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724218 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724327 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.724383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724414 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724845 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725100 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.725347 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.727825 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.728360 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.728462 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.732644 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.724221 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.737649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738013 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738227 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738457 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738735 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.738983 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.740150 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.740629 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.744832 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745313 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745463 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.745585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746049 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746150 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746375 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746598 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.746858 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747078 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747263 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747558 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.747790 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748262 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748436 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748577 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.748924 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.749221 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 
09:11:31.749461 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753339 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753509 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.753989 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: 
I0301 09:11:31.754071 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: 
\"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754236 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754251 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754318 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754454 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754468 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8s2\" (UniqueName: 
\"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.754661 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755448 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755517 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.755883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756124 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756679 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756809 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 01 09:11:31 crc 
kubenswrapper[4792]: I0301 09:11:31.756950 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.756994 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.757007 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.758242 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qtg4x"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.758855 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.759363 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.759890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.762446 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.762555 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763492 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763548 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763677 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763958 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.763969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764344 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764613 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.764991 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765115 
4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765257 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.765698 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.773421 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.776214 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.776698 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.780329 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.784566 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.786133 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.807405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810191 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.810740 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.811742 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.812785 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.812833 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.819879 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.836596 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.837017 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.837322 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.839431 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.841059 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.841556 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.842551 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.843045 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.845135 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.847971 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.849126 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.850337 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.856641 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857319 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857588 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857823 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857948 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857966 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.857983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858091 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858139 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858203 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858219 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858318 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcg5\" (UniqueName: \"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858499 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8s2\" (UniqueName: \"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858715 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858851 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp"
Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.858882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") "
pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.861242 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864648 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864701 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.864711 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.869586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.869876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-node-pullsecrets\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.871287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.872160 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-auth-proxy-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.873168 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.875851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877223 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2311d615-fd4d-43c2-9fcb-8858383c2dc9-config\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877350 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/499393fc-abcf-4998-9e32-3d43a0b1e488-audit-dir\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.877875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e12bfc30-3142-4073-96c7-a377ff6723f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878357 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-encryption-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.878781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e02cd9-3008-41de-b7b7-dc1f546c5645-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.886953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e12bfc30-3142-4073-96c7-a377ff6723f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: 
\"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.887483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2311d615-fd4d-43c2-9fcb-8858383c2dc9-machine-approver-tls\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.888003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.888987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889063 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889630 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.889875 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890298 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.890876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.893375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e0e285-570c-47bd-854e-538c9367486b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897381 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.897843 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.899666 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.900358 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.900518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e02cd9-3008-41de-b7b7-dc1f546c5645-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.921543 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.927406 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.939971 4792 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.941066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.941428 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.948543 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.950672 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.960873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.960963 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" 
(UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961418 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpcg5\" (UniqueName: 
\"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961959 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.961989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962127 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: 
\"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962276 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962602 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962889 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.962959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963074 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod 
\"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963616 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963645 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.963693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.964785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.966477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-trusted-ca\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.969071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.970399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.971183 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a5ad85c-19b5-432d-aa36-d0db74e44744-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.972484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.972499 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9ee9e9d4-e788-41cb-b601-035551b5338c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:31 
crc kubenswrapper[4792]: I0301 09:11:31.976268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.976358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-dir\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.987857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.992861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-encryption-config\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.993023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-images\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 
09:11:31.993715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.993915 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994040 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0b63d94-59de-45da-8058-89714bea7a90-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeacfd31-08e1-49e6-afda-95efa2d815d2-serving-cert\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.994815 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d95b2fd-64be-4688-a596-c41bb31cb9c4-audit-policies\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.995049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.998721 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"] Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.999109 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:31 crc kubenswrapper[4792]: I0301 09:11:31.999519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f432b26-7417-4b71-a63a-5cb9a142bd43-serving-cert\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.000585 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.001359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.002892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51683a24-edad-4808-b2ec-6a628bfdd937-config\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-service-ca-bundle\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-serving-cert\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.003715 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d95b2fd-64be-4688-a596-c41bb31cb9c4-etcd-client\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeacfd31-08e1-49e6-afda-95efa2d815d2-config\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.004975 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/51683a24-edad-4808-b2ec-6a628bfdd937-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 
09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.005801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f432b26-7417-4b71-a63a-5cb9a142bd43-config\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.006186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.006456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9e9d4-e788-41cb-b601-035551b5338c-serving-cert\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: 
\"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.008549 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.011545 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.012266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.012677 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rjwhk"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.017832 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.018175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019248 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019308 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.019558 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020734 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.020877 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.022448 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.027509 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.030763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.032121 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.038208 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.038715 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.040522 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.041174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.042988 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjwhk"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.043254 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.044542 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.044767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.046506 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.046533 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.047323 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.049753 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.051475 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-glj9p"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.052395 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.052949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.057692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.058447 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.059840 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.060824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.061323 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.062574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.072420 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.075062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.076775 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.078620 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.082009 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.082284 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.084470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.085015 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.086145 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.087066 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.088106 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.088379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.089614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.090961 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.093540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.094743 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.103940 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.121797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.142705 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.182214 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.202252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.214184 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-stats-auth\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.221833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.232435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-metrics-certs\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.242291 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.253164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d0571e3-5089-4157-a36a-25ecfe6a67f2-service-ca-bundle\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc 
kubenswrapper[4792]: I0301 09:11:32.263552 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.268749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d0571e3-5089-4157-a36a-25ecfe6a67f2-default-certificate\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.281408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.286663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.303384 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.326432 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.335180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-config\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:32 crc 
kubenswrapper[4792]: I0301 09:11:32.341380 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.362189 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.381249 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.400732 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.421341 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.442618 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.457443 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a5ad85c-19b5-432d-aa36-d0db74e44744-proxy-tls\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.462309 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.481612 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" 
Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.502273 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.520856 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.542332 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.561719 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.567993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3ee47-7c75-4321-8b9c-5e119a92a311-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.581219 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.601408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.622240 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.630347 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3ee47-7c75-4321-8b9c-5e119a92a311-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: \"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.662666 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.682209 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.701888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.722879 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.741822 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.761878 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.782566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.801590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 
09:11:32.822411 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.841894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.859589 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.859630 4792 request.go:700] Waited for 1.007117324s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.859667 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.359646186 +0000 UTC m=+222.601525393 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.861620 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.861670 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.361657125 +0000 UTC m=+222.603536332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.861798 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868413 4792 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868472 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868558 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client 
podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.368504384 +0000 UTC m=+222.610383661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync secret cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.868598 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.368580746 +0000 UTC m=+222.610460093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.877877 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.878017 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.377977087 +0000 UTC m=+222.619856324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.879383 4792 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.879465 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.379443063 +0000 UTC m=+222.621322300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.882261 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.887431 4792 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: E0301 09:11:32.887523 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:33.387500621 +0000 UTC m=+222.629379858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync secret cache: timed out waiting for the condition Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.932423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"route-controller-manager-6576b87f9c-knb62\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.949892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qxf\" (UniqueName: \"kubernetes.io/projected/2311d615-fd4d-43c2-9fcb-8858383c2dc9-kube-api-access-v8qxf\") pod \"machine-approver-56656f9798-lk5qk\" (UID: \"2311d615-fd4d-43c2-9fcb-8858383c2dc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.955554 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g22l6\" (UniqueName: \"kubernetes.io/projected/e12bfc30-3142-4073-96c7-a377ff6723f7-kube-api-access-g22l6\") pod \"openshift-apiserver-operator-796bbdcf4f-zk9bv\" (UID: \"e12bfc30-3142-4073-96c7-a377ff6723f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:32 crc kubenswrapper[4792]: I0301 09:11:32.980167 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp6q\" (UniqueName: \"kubernetes.io/projected/77e0e285-570c-47bd-854e-538c9367486b-kube-api-access-jjp6q\") pod \"cluster-samples-operator-665b6dd947-kq5kp\" (UID: \"77e0e285-570c-47bd-854e-538c9367486b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.002363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdc9\" (UniqueName: \"kubernetes.io/projected/6cc55bdf-6c0f-4d35-879f-c64c2dc4897c-kube-api-access-6wdc9\") pod \"downloads-7954f5f757-wxl8v\" (UID: \"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c\") " pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.019317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8s2\" (UniqueName: \"kubernetes.io/projected/62e02cd9-3008-41de-b7b7-dc1f546c5645-kube-api-access-hn8s2\") pod \"openshift-controller-manager-operator-756b6f6bc6-s68hg\" (UID: \"62e02cd9-3008-41de-b7b7-dc1f546c5645\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.029292 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.036296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"console-f9d7485db-zrzcg\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.044338 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2311d615_fd4d_43c2_9fcb_8858383c2dc9.slice/crio-65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f WatchSource:0}: Error finding container 65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f: Status 404 returned error can't find the container with id 65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.053482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.061978 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.072737 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.083389 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.102181 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.122991 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.142211 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.162379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.172650 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.185872 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.201479 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.205266 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.212540 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.222842 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.233241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.250263 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.266191 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.307405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxvc\" (UniqueName: \"kubernetes.io/projected/6d0571e3-5089-4157-a36a-25ecfe6a67f2-kube-api-access-bvxvc\") pod \"router-default-5444994796-qtg4x\" (UID: \"6d0571e3-5089-4157-a36a-25ecfe6a67f2\") " pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.318616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"oauth-openshift-558db77b4-prqqp\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.327640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.340127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbvd\" (UniqueName: \"kubernetes.io/projected/e0b63d94-59de-45da-8058-89714bea7a90-kube-api-access-spbvd\") pod \"control-plane-machine-set-operator-78cbb6b69f-9smfd\" (UID: \"e0b63d94-59de-45da-8058-89714bea7a90\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.346253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.365354 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"controller-manager-879f6c89f-wl9zt\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.377433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslxf\" (UniqueName: \"kubernetes.io/projected/9d95b2fd-64be-4688-a596-c41bb31cb9c4-kube-api-access-xslxf\") pod \"apiserver-7bbb656c7d-977d4\" (UID: \"9d95b2fd-64be-4688-a596-c41bb31cb9c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc 
kubenswrapper[4792]: I0301 09:11:33.396651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396772 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.396830 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.412021 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcg5\" (UniqueName: \"kubernetes.io/projected/9ee9e9d4-e788-41cb-b601-035551b5338c-kube-api-access-lpcg5\") pod \"openshift-config-operator-7777fb866f-2l2w7\" (UID: \"9ee9e9d4-e788-41cb-b601-035551b5338c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.419736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4xc\" (UniqueName: \"kubernetes.io/projected/1f432b26-7417-4b71-a63a-5cb9a142bd43-kube-api-access-7b4xc\") pod \"authentication-operator-69f744f599-tswcj\" (UID: \"1f432b26-7417-4b71-a63a-5cb9a142bd43\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.422317 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" event={"ID":"62e02cd9-3008-41de-b7b7-dc1f546c5645","Type":"ContainerStarted","Data":"266c36198a1abe62955aa041f65c04fc576809261770109d31b0041307a83502"} Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.422349 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.437500 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"8b2df384673cae5bf90ee9062b4cc2140974e37edfa75e008601913dd12b843c"} Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"65797377256be8bc6bc0a1ee46963b80d3e126171b7e9e542f02e3a14e38f21f"} Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.441636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbg8\" (UniqueName: \"kubernetes.io/projected/0a5ad85c-19b5-432d-aa36-d0db74e44744-kube-api-access-kkbg8\") pod \"machine-config-controller-84d6567774-877gr\" (UID: \"0a5ad85c-19b5-432d-aa36-d0db74e44744\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.464493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dbmff\" (UID: \"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.479654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6d6h\" (UniqueName: \"kubernetes.io/projected/47d3ee47-7c75-4321-8b9c-5e119a92a311-kube-api-access-f6d6h\") pod \"kube-storage-version-migrator-operator-b67b599dd-7b2lz\" (UID: 
\"47d3ee47-7c75-4321-8b9c-5e119a92a311\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.489308 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.498845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qx6z\" (UniqueName: \"kubernetes.io/projected/eeacfd31-08e1-49e6-afda-95efa2d815d2-kube-api-access-8qx6z\") pod \"console-operator-58897d9998-8qrq4\" (UID: \"eeacfd31-08e1-49e6-afda-95efa2d815d2\") " pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.506346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.516612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn288\" (UniqueName: \"kubernetes.io/projected/51683a24-edad-4808-b2ec-6a628bfdd937-kube-api-access-bn288\") pod \"machine-api-operator-5694c8668f-nv4bp\" (UID: \"51683a24-edad-4808-b2ec-6a628bfdd937\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.522871 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c647cb_a9e2_4e75_abb3_5d3cdbe881a2.slice/crio-c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9 WatchSource:0}: Error finding container c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9: Status 404 returned error can't find the container with id c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9 Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 
09:11:33.523581 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.539283 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.541413 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.546802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.553528 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wxl8v"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.564268 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.574005 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc55bdf_6c0f_4d35_879f_c64c2dc4897c.slice/crio-46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab WatchSource:0}: Error finding container 46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab: Status 404 returned error can't find the container with id 46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.581639 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.582566 4792 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.593241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.602418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.602565 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.610297 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.622238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.624528 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.646730 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.651032 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.656485 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.659782 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.661822 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.681984 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.706714 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.721846 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.744348 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.758177 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.767057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.768685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.782938 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.783845 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee9e9d4_e788_41cb_b601_035551b5338c.slice/crio-b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76 WatchSource:0}: Error finding container b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76: Status 404 returned error can't find the container with id b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76 Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.801596 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.804640 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.835097 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.848021 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-877gr"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.852277 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.861749 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.880179 4792 request.go:700] Waited for 1.848095397s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dcollect-profiles-config&limit=500&resourceVersion=0 Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.883708 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a5ad85c_19b5_432d_aa36_d0db74e44744.slice/crio-6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3 WatchSource:0}: Error finding container 6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3: Status 404 returned error can't find the container with id 6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3 Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.883880 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:11:33 crc kubenswrapper[4792]: W0301 09:11:33.883982 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86788093_42e5_4fa0_9595_97a910e6557e.slice/crio-d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e WatchSource:0}: Error finding container d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e: Status 404 returned error can't find the container with id d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 
09:11:33.901441 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.902062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.947430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.955061 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.964992 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 01 09:11:33 crc kubenswrapper[4792]: I0301 09:11:33.988581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.002777 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.023810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.044464 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062335 4792 projected.go:288] Couldn't get configMap openshift-apiserver/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062371 4792 projected.go:194] Error preparing data for projected volume 
kube-api-access-bw279 for pod openshift-apiserver/apiserver-76f77b778f-6lk5b: failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.062440 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279 podName:499393fc-abcf-4998-9e32-3d43a0b1e488 nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.562419236 +0000 UTC m=+223.804298433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bw279" (UniqueName: "kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279") pod "apiserver-76f77b778f-6lk5b" (UID: "499393fc-abcf-4998-9e32-3d43a0b1e488") : failed to sync configmap cache: timed out waiting for the condition Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.065072 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.081758 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.103496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.122827 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.129140 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tswcj"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.131032 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4"] Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.142200 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.219597 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.221470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.221768 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.225523 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.228128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.233524 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-serving-cert\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.241725 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.243871 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.250667 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-serving-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbv6x\" (UniqueName: 
\"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258335 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.258426 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258568 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258599 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.258672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.259130 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.759116848 +0000 UTC m=+224.000996045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.259380 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d95b2fd_64be_4688_a596_c41bb31cb9c4.slice/crio-fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f WatchSource:0}: Error finding container fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f: Status 404 returned error can't find the container with id fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259571 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259626 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nv4bp"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259643 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.259657 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.259891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260125 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.260214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.262620 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.270588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-config\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.281474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.282671 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8qrq4"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.290791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/499393fc-abcf-4998-9e32-3d43a0b1e488-etcd-client\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.301277 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.320816 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.328480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-image-import-ca\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.342080 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.347419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/499393fc-abcf-4998-9e32-3d43a0b1e488-audit\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360638 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " 
pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.360786 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.86074861 +0000 UTC m=+224.102627807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360851 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360854 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360918 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.360972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361004 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4rn\" (UniqueName: \"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361084 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361105 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361151 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361470 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.361714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361750 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod 
\"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.361960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362077 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362121 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362162 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlfc\" (UniqueName: \"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6np\" (UniqueName: \"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362366 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbv6x\" (UniqueName: 
\"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.362919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.362951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: 
\"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d566570d-4f58-487b-b824-839792e88650-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.363966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8s7\" (UniqueName: \"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.365318 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:34.865305492 +0000 UTC m=+224.107184689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.365518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.366056 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-config\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.366796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.367352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: 
\"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.367522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29641af-98a4-47ca-baca-7e933d7a00d5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.368282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-service-ca\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.368474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29641af-98a4-47ca-baca-7e933d7a00d5-config\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.369168 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b1f7190f-8547-4938-8023-708e4891409d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.369615 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b1f7190f-8547-4938-8023-708e4891409d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.370369 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-etcd-client\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.370698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f49f99d-4119-400a-88d5-6fdf48da4d64-serving-cert\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413075 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413219 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.413586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.418055 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51683a24_edad_4808_b2ec_6a628bfdd937.slice/crio-9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e WatchSource:0}: Error finding container 9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e: Status 404 returned error can't find the container with id 9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.418448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.435766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmb5d\" (UniqueName: \"kubernetes.io/projected/d566570d-4f58-487b-b824-839792e88650-kube-api-access-jmb5d\") pod \"multus-admission-controller-857f4d67dd-smnq2\" (UID: \"d566570d-4f58-487b-b824-839792e88650\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.448503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" 
event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerStarted","Data":"452f21dc7923df996fc4ebcc58043ac03b69b3315c7778ffe5676b68b45c4e4f"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.449808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" event={"ID":"e12bfc30-3142-4073-96c7-a377ff6723f7","Type":"ContainerStarted","Data":"0e23e8c1bc8596bfef55378e85c51963d81b7cd8ed6c35c790cb0e766fb2db0e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.449835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" event={"ID":"e12bfc30-3142-4073-96c7-a377ff6723f7","Type":"ContainerStarted","Data":"9a36e7fd9cd6114fc82268c71c363e9c165cf82457d0d5553c7a776d50b2b6e4"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.451172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" event={"ID":"e0b63d94-59de-45da-8058-89714bea7a90","Type":"ContainerStarted","Data":"bd38c09d69ba47ec9ef0c003938aaf7f3ee7204fc6c25ebfda3154487ebc28ca"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.452245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"6737d4e8211f8c4d682ae34ee5d82c8354f8c966d735715288edc667235636e3"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.454883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" event={"ID":"2311d615-fd4d-43c2-9fcb-8858383c2dc9","Type":"ContainerStarted","Data":"502c112c1bd8a34f51ea3e1ea353eca15fa9a9b3aa6b54edd787bb3e1c2118d4"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.455839 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.456706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wxl8v" event={"ID":"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c","Type":"ContainerStarted","Data":"7b9215af955101b19487a965042f4a5f46cb6e7c587c700e7e568a19ff7442d6"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.456736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wxl8v" event={"ID":"6cc55bdf-6c0f-4d35-879f-c64c2dc4897c","Type":"ContainerStarted","Data":"46206446d1604ae68341dc037b20eac001ae46b9e483ab82dab28549f8b6feab"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.458049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerStarted","Data":"d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.458873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerStarted","Data":"fcc34692643388dd6dce5229f4741d15ab4fda6bca936d66d57b07de450c074f"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.459630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" event={"ID":"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842","Type":"ContainerStarted","Data":"fa97e4696280686943f690b7aa07cba997686cb8ae8cbc98143a0bc6a19ba3f1"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 
09:11:34.461502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerStarted","Data":"ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.461552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerStarted","Data":"c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.461683 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.462721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" event={"ID":"62e02cd9-3008-41de-b7b7-dc1f546c5645","Type":"ContainerStarted","Data":"7fc2d832ee54019ed03f6c7bbda197a07d9d4784bbe5f2d4f6eb053289222e48"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463710 4792 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-knb62 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463743 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 
10.217.0.5:8443: connect: connection refused" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.463890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" event={"ID":"1f432b26-7417-4b71-a63a-5cb9a142bd43","Type":"ContainerStarted","Data":"05403abb1063104f17831c789351343da38d5a852d04a7591b0f89dc5c068a3f"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464339 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: 
\"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464592 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8s7\" (UniqueName: 
\"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464759 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4rn\" (UniqueName: 
\"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.464994 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465039 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.465120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465157 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465203 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " 
pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465275 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465328 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod 
\"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465405 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlfc\" (UniqueName: \"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465470 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6np\" (UniqueName: 
\"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465612 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465682 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc 
kubenswrapper[4792]: I0301 09:11:34.465710 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.465840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.466069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:34.966052233 +0000 UTC m=+224.207931430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.466860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d25464be-fe72-4409-a934-9e8c70542ed6-signing-cabundle\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.467487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c283b49-5e58-4c99-97c2-d53ab428265f-config-volume\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.467980 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-mountpoint-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.470438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-csi-data-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9c283b49-5e58-4c99-97c2-d53ab428265f-metrics-tls\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5c679376-bb09-4944-b4ee-3710661612b5-srv-cert\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-registration-dir\") pod \"csi-hostpathplugin-64dsw\" 
(UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.472737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.474020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-webhook-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.474690 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-certs\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/04b15432-c193-4b0c-b527-df9a9b37c886-images\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-socket-dir\") pod 
\"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.475593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9b2af767-57b1-4774-9668-6610e9ac1bb9-plugins-dir\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"9bdf109597e2093483ee18245241f28df0a563ea6bae665bae2821dc90890d2e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476385 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/011a4c5f-1a18-4f0d-884f-43bb6477efb6-trusted-ca\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03389f1b-2d84-4b8b-879d-545498a154cc-config\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03389f1b-2d84-4b8b-879d-545498a154cc-serving-cert\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.476990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b72d8baa-f3f8-4263-a36d-9741ad4243d5-tmpfs\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-node-bootstrap-token\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b15432-c193-4b0c-b527-df9a9b37c886-proxy-tls\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.479945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.480535 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-metrics-tls\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.480527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.481674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"0a6766d4e40e2675b67316cc7d3c875fd936047845e1c3f01d0d7522bb2a7505"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.489222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" event={"ID":"47d3ee47-7c75-4321-8b9c-5e119a92a311","Type":"ContainerStarted","Data":"9203c066c00632815d880164f77515e0b915244aec43f169595ab08d34a5df06"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.493746 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" 
event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerStarted","Data":"87d106e344fa51bb8e5e92cf97b7c6070e8daa571dd37784f078b0bfdb5ba165"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.495955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.503049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b72d8baa-f3f8-4263-a36d-9741ad4243d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.503548 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d25464be-fe72-4409-a934-9e8c70542ed6-signing-key\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.504080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011a4c5f-1a18-4f0d-884f-43bb6477efb6-metrics-tls\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.505962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" 
event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"b3f78f535e977cde4c2e5ec2a45cfe5751a4144dd17031be1b1eeada78b1af76"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.508727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtg4x" event={"ID":"6d0571e3-5089-4157-a36a-25ecfe6a67f2","Type":"ContainerStarted","Data":"1448bc8f4ddb26ee5a66d7f86003329ab8a218e46fb4e7920989f0830ee402b2"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.509831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd837cd0-c714-48e1-8771-cc6c419f7639-cert\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.511439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" event={"ID":"eeacfd31-08e1-49e6-afda-95efa2d815d2","Type":"ContainerStarted","Data":"515004a2e31d7d7d7129680a5ec71f040c47d5513ac0abf1bde7b057757f277e"} Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.517274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.523779 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.526954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb3b55fa-972b-4231-8445-bd4cd9a8b88b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8dwc4\" (UID: \"fb3b55fa-972b-4231-8445-bd4cd9a8b88b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.541624 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.571070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.571258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.571352 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.071335225 +0000 UTC m=+224.313214422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.573933 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw279\" (UniqueName: \"kubernetes.io/projected/499393fc-abcf-4998-9e32-3d43a0b1e488-kube-api-access-bw279\") pod \"apiserver-76f77b778f-6lk5b\" (UID: \"499393fc-abcf-4998-9e32-3d43a0b1e488\") " pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.594838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"auto-csr-approver-29539270-q7hck\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.604058 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.615218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrrj\" (UniqueName: \"kubernetes.io/projected/3f7932d3-c8c1-4f66-94fb-ea1a45b46889-kube-api-access-rdrrj\") pod \"machine-config-server-glj9p\" (UID: \"3f7932d3-c8c1-4f66-94fb-ea1a45b46889\") " pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.634882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bbw\" (UniqueName: \"kubernetes.io/projected/5c679376-bb09-4944-b4ee-3710661612b5-kube-api-access-52bbw\") pod \"catalog-operator-68c6474976-7l7sj\" (UID: \"5c679376-bb09-4944-b4ee-3710661612b5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbv6x\" (UniqueName: \"kubernetes.io/projected/6f49f99d-4119-400a-88d5-6fdf48da4d64-kube-api-access-gbv6x\") pod \"etcd-operator-b45778765-q92nw\" (UID: \"6f49f99d-4119-400a-88d5-6fdf48da4d64\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.638777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc60e4e1-1b94-4913-879c-fbd25ff314b9-srv-cert\") 
pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.639024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4p9\" (UniqueName: \"kubernetes.io/projected/b1f7190f-8547-4938-8023-708e4891409d-kube-api-access-kx4p9\") pod \"cluster-image-registry-operator-dc59b4c8b-8dzrg\" (UID: \"b1f7190f-8547-4938-8023-708e4891409d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.640212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a29641af-98a4-47ca-baca-7e933d7a00d5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r9fc7\" (UID: \"a29641af-98a4-47ca-baca-7e933d7a00d5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.657947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwl7l\" (UniqueName: \"kubernetes.io/projected/04b15432-c193-4b0c-b527-df9a9b37c886-kube-api-access-xwl7l\") pod \"machine-config-operator-74547568cd-ffd8l\" (UID: \"04b15432-c193-4b0c-b527-df9a9b37c886\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.672086 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.672261 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.172230368 +0000 UTC m=+224.414109565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.672334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.672649 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.172639438 +0000 UTC m=+224.414518635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.674456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmn4\" (UniqueName: \"kubernetes.io/projected/cc60e4e1-1b94-4913-879c-fbd25ff314b9-kube-api-access-pvmn4\") pod \"olm-operator-6b444d44fb-qsc9d\" (UID: \"cc60e4e1-1b94-4913-879c-fbd25ff314b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.696846 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjxb\" (UniqueName: \"kubernetes.io/projected/9c283b49-5e58-4c99-97c2-d53ab428265f-kube-api-access-pdjxb\") pod \"dns-default-rjwhk\" (UID: \"9c283b49-5e58-4c99-97c2-d53ab428265f\") " pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.717131 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glj9p" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.719104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"collect-profiles-29539260-g6qtn\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.745058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjx7\" (UniqueName: \"kubernetes.io/projected/9ab8f156-05d7-47d8-b849-a49f1c5cf03b-kube-api-access-bkjx7\") pod \"package-server-manager-789f6589d5-nc4gs\" (UID: \"9ab8f156-05d7-47d8-b849-a49f1c5cf03b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.761866 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.763330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"marketplace-operator-79b997595-gk6c6\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") " pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.771679 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.773610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.774052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.274030565 +0000 UTC m=+224.515909762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.779738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6np\" (UniqueName: \"kubernetes.io/projected/b72d8baa-f3f8-4263-a36d-9741ad4243d5-kube-api-access-6b6np\") pod \"packageserver-d55dfcdfc-4jk5c\" (UID: \"b72d8baa-f3f8-4263-a36d-9741ad4243d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.814455 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlfc\" (UniqueName: 
\"kubernetes.io/projected/644b4b74-7ce9-4d36-8938-58a1e2b2b49f-kube-api-access-jdlfc\") pod \"dns-operator-744455d44c-wpgg2\" (UID: \"644b4b74-7ce9-4d36-8938-58a1e2b2b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.816543 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7932d3_c8c1_4f66_94fb_ea1a45b46889.slice/crio-6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6 WatchSource:0}: Error finding container 6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6: Status 404 returned error can't find the container with id 6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6 Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.819278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.822507 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.831990 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.837366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv7w\" (UniqueName: \"kubernetes.io/projected/9b2af767-57b1-4774-9668-6610e9ac1bb9-kube-api-access-6xv7w\") pod \"csi-hostpathplugin-64dsw\" (UID: \"9b2af767-57b1-4774-9668-6610e9ac1bb9\") " pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.859156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8s7\" (UniqueName: \"kubernetes.io/projected/f3480f1b-eedb-4bc8-b40f-5c527869096a-kube-api-access-hs8s7\") pod \"migrator-59844c95c7-kxx8s\" (UID: \"f3480f1b-eedb-4bc8-b40f-5c527869096a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.874780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.874947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntn7\" (UniqueName: \"kubernetes.io/projected/011a4c5f-1a18-4f0d-884f-43bb6477efb6-kube-api-access-bntn7\") pod \"ingress-operator-5b745b69d9-vgrb8\" (UID: \"011a4c5f-1a18-4f0d-884f-43bb6477efb6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.875150 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.375134854 +0000 UTC m=+224.617014051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.891251 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.892155 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.892247 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.899311 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.900043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4rn\" (UniqueName: \"kubernetes.io/projected/d25464be-fe72-4409-a934-9e8c70542ed6-kube-api-access-rh4rn\") pod \"service-ca-9c57cc56f-r7d4f\" (UID: \"d25464be-fe72-4409-a934-9e8c70542ed6\") " pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.914006 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.923842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2z9s\" (UniqueName: \"kubernetes.io/projected/03389f1b-2d84-4b8b-879d-545498a154cc-kube-api-access-f2z9s\") pod \"service-ca-operator-777779d784-6rmwm\" (UID: \"03389f1b-2d84-4b8b-879d-545498a154cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.924101 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.926031 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-smnq2"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.932883 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.935818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppw5j\" (UniqueName: \"kubernetes.io/projected/bd837cd0-c714-48e1-8771-cc6c419f7639-kube-api-access-ppw5j\") pod \"ingress-canary-dgh8q\" (UID: \"bd837cd0-c714-48e1-8771-cc6c419f7639\") " pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.942588 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.950091 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.957779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.959126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.967137 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.973200 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.975182 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:34 crc kubenswrapper[4792]: E0301 09:11:34.975382 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.475364791 +0000 UTC m=+224.717243988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.985685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:34 crc kubenswrapper[4792]: I0301 09:11:34.987240 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" Mar 01 09:11:34 crc kubenswrapper[4792]: W0301 09:11:34.998732 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd566570d_4f58_487b_b824_839792e88650.slice/crio-d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2 WatchSource:0}: Error finding container d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2: Status 404 returned error can't find the container with id d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2 Mar 01 09:11:35 crc kubenswrapper[4792]: W0301 09:11:35.010646 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4130507_2de2_48c2_9c3f_e9474aeca556.slice/crio-ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448 WatchSource:0}: Error finding container ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448: Status 404 returned error can't find the container with id ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.016495 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.035103 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dgh8q" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.082897 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.083548 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.583532694 +0000 UTC m=+224.825411891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.172693 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.183499 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.183726 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.68371246 +0000 UTC m=+224.925591657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.292461 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q92nw"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.293351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc 
kubenswrapper[4792]: E0301 09:11:35.293811 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.79379749 +0000 UTC m=+225.035676687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.371645 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.395407 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.395685 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:35.895670667 +0000 UTC m=+225.137549864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.475841 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.494124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.502843 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.503370 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.003357408 +0000 UTC m=+225.245236605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.539259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"3bb35e6ef9f0da7b40811ef29e32e353346635d604b9b39b0dc1a4ec5904ca0a"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.552163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" event={"ID":"eeacfd31-08e1-49e6-afda-95efa2d815d2","Type":"ContainerStarted","Data":"9eb47924416a8f2a59a0c453bb2ab1e7ec87e083accfd408d3ab0c81005fb1cc"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.552955 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556653 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-8qrq4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/readyz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556691 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podUID="eeacfd31-08e1-49e6-afda-95efa2d815d2" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.31:8443/readyz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.556989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerStarted","Data":"ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.562143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.567481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" event={"ID":"fb3b55fa-972b-4231-8445-bd4cd9a8b88b","Type":"ContainerStarted","Data":"69eaae17ca0c3b10dd436fcaa8b0dceed744cc2c6da2e2950888ab59627b5abd"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.574122 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glj9p" event={"ID":"3f7932d3-c8c1-4f66-94fb-ea1a45b46889","Type":"ContainerStarted","Data":"6d467a8e74ffb6cb59877a3d9e8122a50b4dc3efb898dfefa40d1ca91fd8bdd6"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.618535 4792 generic.go:334] "Generic (PLEG): container finished" podID="9d95b2fd-64be-4688-a596-c41bb31cb9c4" containerID="828da0440d2ef9c9dc733beff09cbe9cffe6d7dcfa7675ba2e268a387a977a9f" exitCode=0 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.618622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" 
event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerDied","Data":"828da0440d2ef9c9dc733beff09cbe9cffe6d7dcfa7675ba2e268a387a977a9f"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.623691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.624022 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.124005318 +0000 UTC m=+225.365884515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.698848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" event={"ID":"d3e5ed20-a91a-4d9b-a42c-0bf43b8b9842","Type":"ContainerStarted","Data":"ce6c0cb2394b93a60649419ffb968cb456bb229f9d79aeb5525f4e38a90d0561"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.724578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qtg4x" 
event={"ID":"6d0571e3-5089-4157-a36a-25ecfe6a67f2","Type":"ContainerStarted","Data":"107f80305c6b81ce3b32773a2d4c2df641bfa3c110285efe9782296b8c374688"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.726110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.729499 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.229484535 +0000 UTC m=+225.471363722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.733305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" event={"ID":"e0b63d94-59de-45da-8058-89714bea7a90","Type":"ContainerStarted","Data":"435c0c14fbeae7027911e02febcca02089f1b06f392117a8649865a86018cf8c"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.761262 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s"] Mar 01 09:11:35 crc 
kubenswrapper[4792]: I0301 09:11:35.768092 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wpgg2"] Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.783876 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dbmff" podStartSLOduration=163.783853854 podStartE2EDuration="2m43.783853854s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.775131339 +0000 UTC m=+225.017010536" watchObservedRunningTime="2026-03-01 09:11:35.783853854 +0000 UTC m=+225.025733051" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.801719 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerStarted","Data":"c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.802123 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.809806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerStarted","Data":"63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.809867 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.811663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerStarted","Data":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.816885 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podStartSLOduration=163.816872576 podStartE2EDuration="2m43.816872576s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.816844196 +0000 UTC m=+225.058723393" watchObservedRunningTime="2026-03-01 09:11:35.816872576 +0000 UTC m=+225.058751773" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.817323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" event={"ID":"1f432b26-7417-4b71-a63a-5cb9a142bd43","Type":"ContainerStarted","Data":"5d5f80f7330bb6afa57c8bfae40cb90db4a709e518e9787a06868810dc3ed802"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.826974 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.827232 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.327204351 +0000 UTC m=+225.569083548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.835675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"23990c96fef4c71bf759430f4b2c3b47ab3757999edb6510973ca0d075879c46"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.843013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"3960184a2f5a6701136e24eaaa9e26902f8b3063a31ad16fe3f57ca6c4c15e29"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.847917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" event={"ID":"47d3ee47-7c75-4321-8b9c-5e119a92a311","Type":"ContainerStarted","Data":"7d54ee435dd185de59e9d4fd069cd0c74f0b8dc2e24b0519e8bb2b8344d3f21e"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.851407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"d313b7dea9bd493c603419846fc91ae18a2bb505f05af1d04f1b3989b7d817a2"} Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.854851 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899123 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899177 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899419 4792 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wl9zt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899435 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899484 4792 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-prqqp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.899496 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Mar 01 09:11:35 crc kubenswrapper[4792]: W0301 09:11:35.918080 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3480f1b_eedb_4bc8_b40f_5c527869096a.slice/crio-4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82 WatchSource:0}: Error finding container 4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82: Status 404 returned error can't find the container with id 4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82 Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.938379 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lk5qk" podStartSLOduration=164.938354307 podStartE2EDuration="2m44.938354307s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:35.861388282 +0000 UTC m=+225.103267469" watchObservedRunningTime="2026-03-01 09:11:35.938354307 +0000 UTC m=+225.180233514" Mar 01 09:11:35 crc kubenswrapper[4792]: I0301 09:11:35.943114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:35 crc kubenswrapper[4792]: E0301 09:11:35.949364 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.449348028 +0000 UTC m=+225.691227225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.044969 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.046109 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.546094209 +0000 UTC m=+225.787973406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.166111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.166458 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9smfd" podStartSLOduration=164.166442312 podStartE2EDuration="2m44.166442312s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.138140575 +0000 UTC m=+225.380019772" watchObservedRunningTime="2026-03-01 09:11:36.166442312 +0000 UTC m=+225.408321509" Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.178678 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.678637772 +0000 UTC m=+225.920517069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.245441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.251575 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-s68hg" podStartSLOduration=164.251554698 podStartE2EDuration="2m44.251554698s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.224125772 +0000 UTC m=+225.466004969" watchObservedRunningTime="2026-03-01 09:11:36.251554698 +0000 UTC m=+225.493433895" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.271416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.271745 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:36.771729734 +0000 UTC m=+226.013608931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.373819 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.374353 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.87434197 +0000 UTC m=+226.116221167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.410037 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.485457 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.485891 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:36.985874096 +0000 UTC m=+226.227753293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.549504 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6lk5b"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.572011 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zk9bv" podStartSLOduration=165.571996856 podStartE2EDuration="2m45.571996856s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.567046554 +0000 UTC m=+225.808925751" watchObservedRunningTime="2026-03-01 09:11:36.571996856 +0000 UTC m=+225.813876053" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.587419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.587850 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:37.087833156 +0000 UTC m=+226.329712353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.603410 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.610787 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.618309 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:36 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:36 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:36 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.618364 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.632046 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qtg4x" 
podStartSLOduration=164.632027764 podStartE2EDuration="2m44.632027764s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.63064046 +0000 UTC m=+225.872519657" watchObservedRunningTime="2026-03-01 09:11:36.632027764 +0000 UTC m=+225.873906961" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.658235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r7d4f"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.689130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.689393 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.189375696 +0000 UTC m=+226.431254893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.689632 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.689937 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.189895189 +0000 UTC m=+226.431774386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.754670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podStartSLOduration=165.754650593 podStartE2EDuration="2m45.754650593s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:36.739936081 +0000 UTC m=+225.981815278" watchObservedRunningTime="2026-03-01 09:11:36.754650593 +0000 UTC m=+225.996529790" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.756812 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.790883 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.791248 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:37.291233174 +0000 UTC m=+226.533112371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.830144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.834707 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50278: no serving certificate available for the kubelet" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.881144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.891961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.892245 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.39223352 +0000 UTC m=+226.634112717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.909271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"deb4cb205db772c57f32cd07de729e7db366c1ef515aecd7666695017d360fb0"} Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.920996 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c"] Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.969510 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50294: no serving certificate available for the kubelet" Mar 01 09:11:36 crc kubenswrapper[4792]: I0301 09:11:36.993253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:36 crc kubenswrapper[4792]: E0301 09:11:36.993877 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.493846022 +0000 UTC m=+226.735725219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.006464 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-64dsw"] Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.041664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" event={"ID":"cc60e4e1-1b94-4913-879c-fbd25ff314b9","Type":"ContainerStarted","Data":"1e8deb08c9230c5fafd4cb5d4f271455cb951a1172b27605143da8179962768a"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.042543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm"] Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.094325 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50296: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.095324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.096706 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.596692644 +0000 UTC m=+226.838571841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.122040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" event={"ID":"a29641af-98a4-47ca-baca-7e933d7a00d5","Type":"ContainerStarted","Data":"a669e0b4482e33ffdb8d53bbc5ad2cf935a146e10be2babeb35ee8efc1288240"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.148471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.164560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" event={"ID":"d25464be-fe72-4409-a934-9e8c70542ed6","Type":"ContainerStarted","Data":"bceb9c902218cbbaf0ce238fdc30fb7b9ec3eb272099123b2052cead5b010326"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.199257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 
09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.199669 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.699652948 +0000 UTC m=+226.941532145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.205163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" event={"ID":"0a5ad85c-19b5-432d-aa36-d0db74e44744","Type":"ContainerStarted","Data":"9e342f117a951fbdc417be13b431b7e6f53f8a26812ff4e26f3b920583eaf4fa"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.217425 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50310: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.217822 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glj9p" event={"ID":"3f7932d3-c8c1-4f66-94fb-ea1a45b46889","Type":"ContainerStarted","Data":"4ccee4d48a72de23a14c4744d4a706ffdc8caaae4c1f3e4c7c88707cc64282b1"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.228817 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wxl8v" podStartSLOduration=165.228797466 podStartE2EDuration="2m45.228797466s" 
podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.18349433 +0000 UTC m=+226.425373527" watchObservedRunningTime="2026-03-01 09:11:37.228797466 +0000 UTC m=+226.470676663" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.229385 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zrzcg" podStartSLOduration=165.22937988 podStartE2EDuration="2m45.22937988s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.22570912 +0000 UTC m=+226.467588317" watchObservedRunningTime="2026-03-01 09:11:37.22937988 +0000 UTC m=+226.471259077" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.249174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" event={"ID":"fb3b55fa-972b-4231-8445-bd4cd9a8b88b","Type":"ContainerStarted","Data":"d0f8b4226a10c737a4e4e1fb0342d829ed6a9a55651af6451d89aad4be0eb6c3"} Mar 01 09:11:37 crc kubenswrapper[4792]: W0301 09:11:37.270761 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1a0aad_45a6_45d3_bc5d_bbbf2e4fdcc3.slice/crio-e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019 WatchSource:0}: Error finding container e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019: Status 404 returned error can't find the container with id e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019 Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.281129 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:11:37 crc 
kubenswrapper[4792]: I0301 09:11:37.290689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tswcj" podStartSLOduration=166.290668899 podStartE2EDuration="2m46.290668899s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.289259594 +0000 UTC m=+226.531138791" watchObservedRunningTime="2026-03-01 09:11:37.290668899 +0000 UTC m=+226.532548086" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.296968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dgh8q"] Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.304507 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.305667 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.805653048 +0000 UTC m=+227.047532245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.331769 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjwhk"] Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.339977 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"4134ab0e66c0f4d4a6c95c020bf21196fa893d296dd143d28e5c16f18fad1e82"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.357560 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podStartSLOduration=165.357542185 podStartE2EDuration="2m45.357542185s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.338894106 +0000 UTC m=+226.580773303" watchObservedRunningTime="2026-03-01 09:11:37.357542185 +0000 UTC m=+226.599421382" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.366001 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50316: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.376683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" 
event={"ID":"b1f7190f-8547-4938-8023-708e4891409d","Type":"ContainerStarted","Data":"a7da6efe1203d3322dcc417e6fdf30da39440d2bce52db39523da3be26f9f320"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.406579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.407712 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:37.907664049 +0000 UTC m=+227.149543246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.427534 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7b2lz" podStartSLOduration=165.427519298 podStartE2EDuration="2m45.427519298s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.418968867 +0000 UTC m=+226.660848064" watchObservedRunningTime="2026-03-01 
09:11:37.427519298 +0000 UTC m=+226.669398495" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.447318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podStartSLOduration=166.447301875 podStartE2EDuration="2m46.447301875s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.444459405 +0000 UTC m=+226.686338602" watchObservedRunningTime="2026-03-01 09:11:37.447301875 +0000 UTC m=+226.689181072" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.474535 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" podStartSLOduration=165.474520015 podStartE2EDuration="2m45.474520015s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.47268005 +0000 UTC m=+226.714559247" watchObservedRunningTime="2026-03-01 09:11:37.474520015 +0000 UTC m=+226.716399212" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.499177 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"b437d533301ec1a8bf145b28fa75526b9244e8515ab202b1f8f1ae9e07990c81"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.499743 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-glj9p" podStartSLOduration=6.499726045 podStartE2EDuration="6.499726045s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-01 09:11:37.499120781 +0000 UTC m=+226.740999978" watchObservedRunningTime="2026-03-01 09:11:37.499726045 +0000 UTC m=+226.741605242" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.507598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.511362 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.011331181 +0000 UTC m=+227.253210378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.516101 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"038db8caba2d61180a1409412260d624f871b2f05eaa4b1054a8c506f49680ee"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.539606 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8dwc4" 
podStartSLOduration=165.539589497 podStartE2EDuration="2m45.539589497s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.538975432 +0000 UTC m=+226.780854629" watchObservedRunningTime="2026-03-01 09:11:37.539589497 +0000 UTC m=+226.781468694" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.589329 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50318: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.590480 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" podStartSLOduration=165.590464849 podStartE2EDuration="2m45.590464849s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.589744992 +0000 UTC m=+226.831624189" watchObservedRunningTime="2026-03-01 09:11:37.590464849 +0000 UTC m=+226.832344036" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591016 4792 generic.go:334] "Generic (PLEG): container finished" podID="9ee9e9d4-e788-41cb-b601-035551b5338c" containerID="14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed" exitCode=0 Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerDied","Data":"14601b3d86899c0b127c11ea852ecc09b467fb320b1f1a289b6ea1819e6deeed"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.591749 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:37 
crc kubenswrapper[4792]: I0301 09:11:37.603350 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"0ee72213a44cb66137fb2d70f1a1b01a4afa69cb60d872ae25778f630b635928"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.613638 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:37 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:37 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:37 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.613696 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.614072 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.614443 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.114422649 +0000 UTC m=+227.356301856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.637517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" event={"ID":"6f49f99d-4119-400a-88d5-6fdf48da4d64","Type":"ContainerStarted","Data":"d9feede3d11684c685f93f46f379d2cdbd7d87e0cfb7e43c4bccc7132411f21d"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.655581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-877gr" podStartSLOduration=165.655568262 podStartE2EDuration="2m45.655568262s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.653386648 +0000 UTC m=+226.895265845" watchObservedRunningTime="2026-03-01 09:11:37.655568262 +0000 UTC m=+226.897447449" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.684495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" podStartSLOduration=165.684476294 podStartE2EDuration="2m45.684476294s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.682950726 +0000 UTC m=+226.924829923" watchObservedRunningTime="2026-03-01 09:11:37.684476294 +0000 UTC m=+226.926355491" Mar 01 09:11:37 crc 
kubenswrapper[4792]: I0301 09:11:37.698774 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50322: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.709123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"2dc75b9ebc33c1710fdb48d0d0e0e4568bf56596b98b86a29ae12f220ad2dc9a"} Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.710110 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.710152 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.715183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.728675 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.728799 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.228776884 +0000 UTC m=+227.470656081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.732880 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" podStartSLOduration=166.732860715 podStartE2EDuration="2m46.732860715s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:37.731610164 +0000 UTC m=+226.973489381" watchObservedRunningTime="2026-03-01 09:11:37.732860715 +0000 UTC m=+226.974739912" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.779575 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.816534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.816998 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.316982476 +0000 UTC m=+227.558861673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.819869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.820199 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.320189595 +0000 UTC m=+227.562068792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.853279 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50338: no serving certificate available for the kubelet" Mar 01 09:11:37 crc kubenswrapper[4792]: I0301 09:11:37.926418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:37 crc kubenswrapper[4792]: E0301 09:11:37.926770 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.426755718 +0000 UTC m=+227.668634915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.031540 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.031790 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.531779334 +0000 UTC m=+227.773658531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.132586 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.138789 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.638765128 +0000 UTC m=+227.880644325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.235608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.235983 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.735971481 +0000 UTC m=+227.977850678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.336291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.336463 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.836430744 +0000 UTC m=+228.078309941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.336969 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.337326 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.837313836 +0000 UTC m=+228.079193033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.437511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.437942 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:38.937923562 +0000 UTC m=+228.179802759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.525205 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.543279 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.043267936 +0000 UTC m=+228.285147133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.543001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.608548 4792 patch_prober.go:28] 
interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:38 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:38 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:38 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.608823 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.625997 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37460: no serving certificate available for the kubelet" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.628406 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.645340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.645694 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.145665607 +0000 UTC m=+228.387544794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.721466 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-8qrq4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.721505 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" podUID="eeacfd31-08e1-49e6-afda-95efa2d815d2" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.745338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" event={"ID":"03389f1b-2d84-4b8b-879d-545498a154cc","Type":"ContainerStarted","Data":"be5dca9df6311edb2af7a577d893ee5693464113e3b6cc2ba2321f2d46420c86"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.745400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" event={"ID":"03389f1b-2d84-4b8b-879d-545498a154cc","Type":"ContainerStarted","Data":"dfb9fd1ab6ea1a0f91d85ac829d57bd567353f35088f191e271584b0ccf4e0ec"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.746645 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.747033 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.247020752 +0000 UTC m=+228.488899949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.762890 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q92nw" event={"ID":"6f49f99d-4119-400a-88d5-6fdf48da4d64","Type":"ContainerStarted","Data":"b18c2370e5e8fce7f5f9bb58fb2c42790591fd6902256b0182fd699718ea3bb4"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.773208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"5c5a2b17f732f695da3d4866b48be8b4675888ddeed2320710da5fc39d06bbdb"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.775180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"1d1e2d22704d416cbd49572f2e0f199e4f1cbc9524b80c29bc32d87307e59b79"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.776523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nv4bp" event={"ID":"51683a24-edad-4808-b2ec-6a628bfdd937","Type":"ContainerStarted","Data":"7330b89e50cc3be0cf56d7d80385f1422cd9b492692ed2007be996954dfaf2cd"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerStarted","Data":"8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerStarted","Data":"5fe7291196c34fc28ce9dd0b3ec6175bb85545475a2aafe649770e45dc76a617"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.792673 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.799755 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gk6c6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.799810 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" 
podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.817199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6rmwm" podStartSLOduration=166.817180909 podStartE2EDuration="2m46.817180909s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:38.814839811 +0000 UTC m=+228.056719008" watchObservedRunningTime="2026-03-01 09:11:38.817180909 +0000 UTC m=+228.059060106" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.833592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"abaead56598a20893bfa16ce2ecd0037f3a66d16ae9b1a77ca85a4700fdf9590"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.840062 4792 generic.go:334] "Generic (PLEG): container finished" podID="499393fc-abcf-4998-9e32-3d43a0b1e488" containerID="d19e02d3394aff309dd88dd48aab49226b866dfa35122864b9992aeb050f8afd" exitCode=0 Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.840114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerDied","Data":"d19e02d3394aff309dd88dd48aab49226b866dfa35122864b9992aeb050f8afd"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.845459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dgh8q" 
event={"ID":"bd837cd0-c714-48e1-8771-cc6c419f7639","Type":"ContainerStarted","Data":"1212105a46de0acf3cd80080586379061d3be2534ce8e94ecb5417d6b6b7e92c"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.850685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.851645 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.351627427 +0000 UTC m=+228.593506624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.867709 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dzrg" event={"ID":"b1f7190f-8547-4938-8023-708e4891409d","Type":"ContainerStarted","Data":"925c80281e9b04110590674a902f3a43bab170a2e87b3fb5f642b3323460bb96"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.899248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" 
event={"ID":"9d95b2fd-64be-4688-a596-c41bb31cb9c4","Type":"ContainerStarted","Data":"60179549a9f0b5dbaf0c97716a3f646403d0000e866b3ec196196bebb111fdf6"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.904605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" event={"ID":"5c679376-bb09-4944-b4ee-3710661612b5","Type":"ContainerStarted","Data":"b4afdc9e206052462c89c3fd37ca081334d6d0b37f35a823344ae1d150a66765"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.904626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" event={"ID":"5c679376-bb09-4944-b4ee-3710661612b5","Type":"ContainerStarted","Data":"65210aeaea8be68edc35f13511f341f60e638663e33c4559312f6d0fb836a124"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.905437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.906864 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7l7sj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.906925 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podUID="5c679376-bb09-4944-b4ee-3710661612b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.952936 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.953275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"ed567a0606fbc6632e6937fff7b5a0bb538e5c37024978b1b514629d510fa5e4"} Mar 01 09:11:38 crc kubenswrapper[4792]: E0301 09:11:38.953677 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.453658308 +0000 UTC m=+228.695537595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.975290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" event={"ID":"a29641af-98a4-47ca-baca-7e933d7a00d5","Type":"ContainerStarted","Data":"310e2042f2dd50873aa63a05decd48e217cbb9950371820b3adbbd2fa7609208"} Mar 01 09:11:38 crc kubenswrapper[4792]: I0301 09:11:38.989571 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podStartSLOduration=166.989548531 podStartE2EDuration="2m46.989548531s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:38.896274646 +0000 UTC m=+228.138153833" watchObservedRunningTime="2026-03-01 09:11:38.989548531 +0000 UTC m=+228.231427728" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.003470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" event={"ID":"77e0e285-570c-47bd-854e-538c9367486b","Type":"ContainerStarted","Data":"cca2ec9af1276f9b1ad87b56b9c2e2a8e7900da42011183a721ef128a1b49dc1"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.015926 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" 
event={"ID":"cc60e4e1-1b94-4913-879c-fbd25ff314b9","Type":"ContainerStarted","Data":"20fb1032019b139ae4a359755a225f6fce14f8690da82b73371ec721bf4673cb"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.017302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.018216 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qsc9d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.018282 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" podUID="cc60e4e1-1b94-4913-879c-fbd25ff314b9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.026561 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" event={"ID":"b72d8baa-f3f8-4263-a36d-9741ad4243d5","Type":"ContainerStarted","Data":"1ab007922347514bc5d518da115315068e031c931fbd22701561beac17eb7fe2"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.026629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" event={"ID":"b72d8baa-f3f8-4263-a36d-9741ad4243d5","Type":"ContainerStarted","Data":"93d57c76821b6e6e039439288be1d0054cddb9b84b3c596c23b2a851a1266b29"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.027212 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.028256 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jk5c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.028294 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podUID="b72d8baa-f3f8-4263-a36d-9741ad4243d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.039050 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" event={"ID":"9ee9e9d4-e788-41cb-b601-035551b5338c","Type":"ContainerStarted","Data":"3dcbc32a302bd6ebdeb601d73aa8576d35bceb3405d2c12ddb512b11a49f528f"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.054518 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.055568 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.555541256 +0000 UTC m=+228.797420453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.059088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"36808e59f4f8c78214d7893f962d00204ea81733bd6ea7647736d9ad3a1e0d3c"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.059136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"118cde52251d3c7901a6a4aaffb7eca4c854199aa8f158dcce2329331bd9556e"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.106594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"6921e70e382ff7e5603326ccb1dd2ca4e61c2b0663734c079c837b7a00b56460"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.138768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerStarted","Data":"db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.138813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerStarted","Data":"e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.156284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.156738 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" podStartSLOduration=167.156722867 podStartE2EDuration="2m47.156722867s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.155650421 +0000 UTC m=+228.397529618" watchObservedRunningTime="2026-03-01 09:11:39.156722867 +0000 UTC m=+228.398602064" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.159555 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.659541296 +0000 UTC m=+228.901420573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.176289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" event={"ID":"d25464be-fe72-4409-a934-9e8c70542ed6","Type":"ContainerStarted","Data":"d2e31f8f3a3cfe7df3dfe74597fc3ce0a32ea60b69cfa1435f8678ab36607202"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.215166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"4880a13bcfcb4c6ccb79a8e246e35f9ebc144536cdd75e778c6a07e018feab1d"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.248296 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" containerID="cri-o://ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00" gracePeriod=30 Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.249052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" event={"ID":"d566570d-4f58-487b-b824-839792e88650","Type":"ContainerStarted","Data":"f0dac4830900a0ec6f28445041d7a79b7ef7e37a2991e9e677f9b5b3a8edb055"} Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.259391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.260043 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.76002848 +0000 UTC m=+229.001907677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.320418 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kq5kp" podStartSLOduration=168.320401057 podStartE2EDuration="2m48.320401057s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.240356676 +0000 UTC m=+228.482235873" watchObservedRunningTime="2026-03-01 09:11:39.320401057 +0000 UTC m=+228.562280254" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.361160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.361998 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.86198053 +0000 UTC m=+229.103859727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.410215 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podStartSLOduration=167.410194277 podStartE2EDuration="2m47.410194277s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.329202903 +0000 UTC m=+228.571082100" watchObservedRunningTime="2026-03-01 09:11:39.410194277 +0000 UTC m=+228.652073474" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.410828 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r9fc7" podStartSLOduration=167.410823283 podStartE2EDuration="2m47.410823283s" 
podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.404986569 +0000 UTC m=+228.646865766" watchObservedRunningTime="2026-03-01 09:11:39.410823283 +0000 UTC m=+228.652702480" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.463259 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.463443 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.963416557 +0000 UTC m=+229.205295754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.463509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.463832 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:39.963824867 +0000 UTC m=+229.205704064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.539052 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8qrq4" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.541604 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" podStartSLOduration=167.541594322 podStartE2EDuration="2m47.541594322s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.53135845 +0000 UTC m=+228.773237637" watchObservedRunningTime="2026-03-01 09:11:39.541594322 +0000 UTC m=+228.783473519" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.542074 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podStartSLOduration=167.542069834 podStartE2EDuration="2m47.542069834s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.504824727 +0000 UTC m=+228.746703924" watchObservedRunningTime="2026-03-01 09:11:39.542069834 +0000 UTC m=+228.783949031" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.564462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.564720 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.064704191 +0000 UTC m=+229.306583388 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.564830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.565177 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.065167422 +0000 UTC m=+229.307046619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.605773 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:39 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:39 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:39 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.605822 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.665610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.665756 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:40.165731958 +0000 UTC m=+229.407611155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.666011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.666290 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.166277941 +0000 UTC m=+229.408157138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.729928 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-r7d4f" podStartSLOduration=167.729914208 podStartE2EDuration="2m47.729914208s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.723581342 +0000 UTC m=+228.965460539" watchObservedRunningTime="2026-03-01 09:11:39.729914208 +0000 UTC m=+228.971793395" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.730355 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-smnq2" podStartSLOduration=167.730350979 podStartE2EDuration="2m47.730350979s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.661173046 +0000 UTC m=+228.903052243" watchObservedRunningTime="2026-03-01 09:11:39.730350979 +0000 UTC m=+228.972230176" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.767468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.767892 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.267873482 +0000 UTC m=+229.509752679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.832942 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" podStartSLOduration=167.832899433 podStartE2EDuration="2m47.832899433s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.83237306 +0000 UTC m=+229.074252257" watchObservedRunningTime="2026-03-01 09:11:39.832899433 +0000 UTC m=+229.074778620" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.862580 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" podStartSLOduration=167.862566614 podStartE2EDuration="2m47.862566614s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:39.861283422 +0000 UTC 
m=+229.103162619" watchObservedRunningTime="2026-03-01 09:11:39.862566614 +0000 UTC m=+229.104445811" Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.869031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.869372 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.369358351 +0000 UTC m=+229.611237548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.970645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.971109 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.471092965 +0000 UTC m=+229.712972162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:39 crc kubenswrapper[4792]: I0301 09:11:39.971206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:39 crc kubenswrapper[4792]: E0301 09:11:39.971506 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.471498395 +0000 UTC m=+229.713377582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.072760 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.073101 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.573086576 +0000 UTC m=+229.814965773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.176576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.177052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.677031745 +0000 UTC m=+229.918910942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.230070 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37472: no serving certificate available for the kubelet" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.259917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" event={"ID":"f3480f1b-eedb-4bc8-b40f-5c527869096a","Type":"ContainerStarted","Data":"2cd2b1c1fbe79104b79a7890f1c68da705c6c5e1f5dbbfa0920403dc7e96efba"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.264640 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"e53e3a76c13c685b981c673a044faf023e87e94415d3dcf45af467741eee12f5"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.268286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" event={"ID":"9ab8f156-05d7-47d8-b849-a49f1c5cf03b","Type":"ContainerStarted","Data":"84811a8ce860a6a4cc0a951effc898bb30e8e3deb4250372376b597312011ceb"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.268458 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.270754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"6fb9ac3e107496f78583b693f70c6629624ebf04472a6d608ffab0ddb5dba2bc"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.279576 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.280002 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.77998429 +0000 UTC m=+230.021863487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.304315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ffd8l" event={"ID":"04b15432-c193-4b0c-b527-df9a9b37c886","Type":"ContainerStarted","Data":"c365ade0cbb23c31a7311c8d7fe1c2e6fab58e5adf5f5b03ac2553d4a5c21143"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.326425 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" 
event={"ID":"011a4c5f-1a18-4f0d-884f-43bb6477efb6","Type":"ContainerStarted","Data":"5dc3ca91e3a6774ef6c0336afe891e3614c113107660d02914bf8d650bab1f0a"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.328817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" event={"ID":"644b4b74-7ce9-4d36-8938-58a1e2b2b49f","Type":"ContainerStarted","Data":"cbc7ed6e024b61eac779c0b97e21c1f04fb520a328aa8f9223497fda65407cf5"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.340523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"84b95879ba5e2c4dff9cbfdb619c290685a9892fa4f1d69242e07c9ebdc9f878"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.340570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" event={"ID":"499393fc-abcf-4998-9e32-3d43a0b1e488","Type":"ContainerStarted","Data":"6a2e2c0b304e235ef71551cb9a82a7f076605c19141d9dc370a420d2898365ee"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.343514 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxx8s" podStartSLOduration=168.343498203 podStartE2EDuration="2m48.343498203s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.343169835 +0000 UTC m=+229.585049032" watchObservedRunningTime="2026-03-01 09:11:40.343498203 +0000 UTC m=+229.585377400" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.346602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dgh8q" 
event={"ID":"bd837cd0-c714-48e1-8771-cc6c419f7639","Type":"ContainerStarted","Data":"80b46179a65824ed46675ea3d847fe75d4b6f791a318c5df26408d25529b470b"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.356233 4792 generic.go:334] "Generic (PLEG): container finished" podID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerID="ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00" exitCode=0 Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerDied","Data":"ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00"} Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357483 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jk5c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357519 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" podUID="b72d8baa-f3f8-4263-a36d-9741ad4243d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357572 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-7l7sj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.357608 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" podUID="5c679376-bb09-4944-b4ee-3710661612b5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368177 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gk6c6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368245 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.368403 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" containerID="cri-o://c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" gracePeriod=30 Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.380932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.382698 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.882676608 +0000 UTC m=+230.124555805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.455348 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qsc9d" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.482660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.482859 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.982827693 +0000 UTC m=+230.224706890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.487608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.488143 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:40.988130314 +0000 UTC m=+230.230009511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.513265 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vgrb8" podStartSLOduration=168.513248482 podStartE2EDuration="2m48.513248482s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.42947762 +0000 UTC m=+229.671356817" watchObservedRunningTime="2026-03-01 09:11:40.513248482 +0000 UTC m=+229.755127679" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.568016 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" podStartSLOduration=169.56799688 podStartE2EDuration="2m49.56799688s" podCreationTimestamp="2026-03-01 09:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.515687362 +0000 UTC m=+229.757566569" watchObservedRunningTime="2026-03-01 09:11:40.56799688 +0000 UTC m=+229.809876077" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.589639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.590283 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.090268839 +0000 UTC m=+230.332148036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.609769 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" podStartSLOduration=168.609751268 podStartE2EDuration="2m48.609751268s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.570885631 +0000 UTC m=+229.812764828" watchObservedRunningTime="2026-03-01 09:11:40.609751268 +0000 UTC m=+229.851630465" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.644136 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:40 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:40 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:40 crc kubenswrapper[4792]: 
healthz check failed Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.644195 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.652554 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wpgg2" podStartSLOduration=168.652537511 podStartE2EDuration="2m48.652537511s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.610317812 +0000 UTC m=+229.852197009" watchObservedRunningTime="2026-03-01 09:11:40.652537511 +0000 UTC m=+229.894416708" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.691518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.691865 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.191854019 +0000 UTC m=+230.433733216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.792627 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.793010 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.292992899 +0000 UTC m=+230.534872096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.793075 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.793328 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.293317907 +0000 UTC m=+230.535197104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.851310 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.886586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dgh8q" podStartSLOduration=9.886564413 podStartE2EDuration="9.886564413s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:40.716482686 +0000 UTC m=+229.958361883" watchObservedRunningTime="2026-03-01 09:11:40.886564413 +0000 UTC m=+230.128443610" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.889403 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889418 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889512 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" containerName="route-controller-manager" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.889852 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.893848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.894181 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.3941634 +0000 UTC m=+230.636042597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.908088 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.995373 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996127 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996161 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnjdm\" (UniqueName: \"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") pod \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\" (UID: \"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2\") " Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.996572 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:40 crc kubenswrapper[4792]: E0301 09:11:40.997555 4792 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.497543775 +0000 UTC m=+230.739422972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:40 crc kubenswrapper[4792]: I0301 09:11:40.997742 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config" (OuterVolumeSpecName: "config") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.017164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm" (OuterVolumeSpecName: "kube-api-access-fnjdm") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). InnerVolumeSpecName "kube-api-access-fnjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.021237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" (UID: "44c647cb-a9e2-4e75-abb3-5d3cdbe881a2"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.097367 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.597315401 +0000 UTC m=+230.839194598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097648 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097708 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097718 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnjdm\" (UniqueName: 
\"kubernetes.io/projected/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-kube-api-access-fnjdm\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.097730 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.097992 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.597979808 +0000 UTC m=+230.839859005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.100063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.101405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " 
pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.103148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.124209 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"route-controller-manager-8ddf4cb6d-chpwp\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.198603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.198800 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.698774659 +0000 UTC m=+230.940653856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.199019 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.199323 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.699310842 +0000 UTC m=+230.941190039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.203262 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.302398 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.302872 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.802855101 +0000 UTC m=+231.044734298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.380495 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2l2w7 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.380559 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" 
podUID="9ee9e9d4-e788-41cb-b601-035551b5338c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.403871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.404228 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:41.904212107 +0000 UTC m=+231.146091304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.421835 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.427377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62" event={"ID":"44c647cb-a9e2-4e75-abb3-5d3cdbe881a2","Type":"ContainerDied","Data":"c11caa71735c72b6244f64898704ac0350a94ec2fdec7c12cfb52f25b0fabfa9"} Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.427421 4792 scope.go:117] "RemoveContainer" containerID="ee95da5910e6e2c13434125c99652d9a77af8d0b5457ae7a7ce20a9282ee3b00" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.448418 4792 generic.go:334] "Generic (PLEG): container finished" podID="8578f8dc-143c-423c-b62b-b3190444bafd" containerID="c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" exitCode=0 Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.448479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerDied","Data":"c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672"} Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.457251 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.462169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjwhk" event={"ID":"9c283b49-5e58-4c99-97c2-d53ab428265f","Type":"ContainerStarted","Data":"f4727a8f5737e6c7fab94f58e28153995e9689650f62a525e2ac2573957fbb21"} Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.462642 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.479646 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7l7sj" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.504552 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.504707 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.00467728 +0000 UTC m=+231.246556477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.504738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.505163 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.005150751 +0000 UTC m=+231.247029948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624159 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624533 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624628 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.624647 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") pod \"8578f8dc-143c-423c-b62b-b3190444bafd\" (UID: \"8578f8dc-143c-423c-b62b-b3190444bafd\") " Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.626679 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.126659583 +0000 UTC m=+231.368538780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.630806 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca" (OuterVolumeSpecName: "client-ca") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.630866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config" (OuterVolumeSpecName: "config") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.636528 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:41 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:41 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:41 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.636578 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.645473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.659899 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.732416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x" (OuterVolumeSpecName: "kube-api-access-6qq5x") pod "8578f8dc-143c-423c-b62b-b3190444bafd" (UID: "8578f8dc-143c-423c-b62b-b3190444bafd"). InnerVolumeSpecName "kube-api-access-6qq5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.732928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733030 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733041 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733051 4792 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8578f8dc-143c-423c-b62b-b3190444bafd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733058 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8578f8dc-143c-423c-b62b-b3190444bafd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.733068 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qq5x\" (UniqueName: \"kubernetes.io/projected/8578f8dc-143c-423c-b62b-b3190444bafd-kube-api-access-6qq5x\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.733514 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.233502173 +0000 UTC m=+231.475381370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.772493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rjwhk" podStartSLOduration=10.772470993 podStartE2EDuration="10.772470993s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:41.648270765 +0000 UTC m=+230.890149962" watchObservedRunningTime="2026-03-01 09:11:41.772470993 +0000 UTC m=+231.014350190" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.821504 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837957 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.837979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.848234 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.348208707 +0000 UTC m=+231.590087904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.849162 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-knb62"] Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.850509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.851570 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.859885 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.866681 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.876865 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.950666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:41 crc kubenswrapper[4792]: E0301 09:11:41.951046 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.451034338 +0000 UTC m=+231.692913535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:41 crc kubenswrapper[4792]: I0301 09:11:41.980501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:42 crc kubenswrapper[4792]: W0301 09:11:42.003275 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3e715b_f024_4842_94ed_1f1e054e89c6.slice/crio-92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259 WatchSource:0}: Error finding container 92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259: Status 404 returned error can't find the container with id 92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259 Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.051939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.052167 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.552135787 +0000 UTC m=+231.794014984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.052311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.052686 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.552670701 +0000 UTC m=+231.794549958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.134968 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.153369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.153775 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.653756589 +0000 UTC m=+231.895635786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.178872 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.228661 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jk5c" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.255743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.256103 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.756091308 +0000 UTC m=+231.997970505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.357419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.357712 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.85769898 +0000 UTC m=+232.099578177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.448588 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2l2w7" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.459646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.459953 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:42.959941927 +0000 UTC m=+232.201821124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.528269 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.536126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wl9zt" event={"ID":"8578f8dc-143c-423c-b62b-b3190444bafd","Type":"ContainerDied","Data":"452f21dc7923df996fc4ebcc58043ac03b69b3315c7778ffe5676b68b45c4e4f"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.536185 4792 scope.go:117] "RemoveContainer" containerID="c13739edba999be4dbdb0675da20f693f5c2f873f236f49fe221573f802f1672" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.558974 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerStarted","Data":"3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.559013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerStarted","Data":"92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.559741 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.560012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.564997 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.064969612 +0000 UTC m=+232.306848809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.567002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"8af0acb3b09b5977996abac07474f1904f77cdc88726857516e42095b8e57ae7"} Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.583724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.585240 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-wl9zt"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.597368 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.597560 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.606623 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:42 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:42 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:42 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.606677 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.609177 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podStartSLOduration=3.609163379 podStartE2EDuration="3.609163379s" 
podCreationTimestamp="2026-03-01 09:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:42.604353981 +0000 UTC m=+231.846233188" watchObservedRunningTime="2026-03-01 09:11:42.609163379 +0000 UTC m=+231.851042576" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.660984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.662041 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.162029761 +0000 UTC m=+232.403908958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669288 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.669492 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669502 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.669623 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" containerName="controller-manager" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.670520 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.674549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.707822 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.764462 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.264435262 +0000 UTC m=+232.506314459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.764674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.765174 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.26516457 +0000 UTC m=+232.507043767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.829306 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.852384 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.856708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865342 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865512 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.865588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.866039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 09:11:42.866123 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.366104765 +0000 UTC m=+232.607983972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.866291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.870348 4792 ???:1] "http: TLS handshake error from 192.168.126.11:37478: no serving certificate available for the kubelet" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.881838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.914274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc28k\" 
(UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"community-operators-wxb87\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") " pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970390 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:42 crc kubenswrapper[4792]: I0301 09:11:42.970733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:42 crc kubenswrapper[4792]: E0301 
09:11:42.971015 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.471003657 +0000 UTC m=+232.712882854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.020188 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.020238 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.027457 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.035435 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073298 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073545 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.073643 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.074322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.074389 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.574375692 +0000 UTC m=+232.816254889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.078797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.105976 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"certified-operators-cw675\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") " pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.135534 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.143427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.145870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.151339 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.154974 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175667 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175723 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: 
\"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.175802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.176069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.676058706 +0000 UTC m=+232.917937903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.188629 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.188837 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 
10.217.0.6:8080: connect: connection refused" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.189056 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.189111 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.206812 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.206842 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.211784 4792 patch_prober.go:28] interesting pod/console-f9d7485db-zrzcg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.211852 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.226464 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:11:43 crc kubenswrapper[4792]: 
I0301 09:11:43.228164 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.242249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.250033 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276669 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276691 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.276729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.277422 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.777407531 +0000 UTC m=+233.019286718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.277756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.279155 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: W0301 09:11:43.329225 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e WatchSource:0}: Error finding container 9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e: Status 404 returned error can't find the container with id 9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.336727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whj6h\" (UniqueName: 
\"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"community-operators-p6b9w\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.361308 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378346 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378442 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.378543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.378826 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.878810037 +0000 UTC m=+233.120689234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.404223 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.404845 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421403 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421657 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421759 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.421918 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.422018 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.422184 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:11:43 crc 
kubenswrapper[4792]: I0301 09:11:43.429438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.448204 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c647cb-a9e2-4e75-abb3-5d3cdbe881a2" path="/var/lib/kubelet/pods/44c647cb-a9e2-4e75-abb3-5d3cdbe881a2/volumes" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.448849 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8578f8dc-143c-423c-b62b-b3190444bafd" path="/var/lib/kubelet/pods/8578f8dc-143c-423c-b62b-b3190444bafd/volumes" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.449299 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.450554 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.479873 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " 
pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480230 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") 
pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.480454 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:43.980439749 +0000 UTC m=+233.222318946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.480897 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.481832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.508672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"certified-operators-l7ngd\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.518234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.562629 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581084 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581373 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581409 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.581483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.583287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.583827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.584106 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.084092601 +0000 UTC m=+233.325971798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.604587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.607001 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.622372 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.631881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"84214f99543f0b1e0849c6f8c11a2101441281fafaac18eb335087b26b29526d"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.632021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0dbc10d9857ea2b6db4cc8848cb7c7653c840c463dc02da4147a3da7ef6a9a30"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.648607 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.651931 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.661654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"controller-manager-5bf56554b8-qv8hr\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.666201 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:43 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:43 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:43 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.666262 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.683625 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.683865 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.183828616 +0000 UTC m=+233.425707813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.684287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.684664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.688094 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:44.18806828 +0000 UTC m=+233.429947477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.733796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9bfd625c54787a996c945bf828a16cb9a2d42053182e44eae51137d2a911fd8e"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"702340f9d880baf765b9e70f1d41d931dc8733c24078f88256d94345792f4b16"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"61a3e72aa73e0fa3505422f8f0578d5003513bebae5119711655632907b066b5"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.757576 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.763377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.767365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"7e2a93d4148b8e054c7f0ca5a48dbf78d0e0ace6e87d813fc0b575cc94d8e3cc"} Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.787473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.788089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.788864 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.288852542 +0000 UTC m=+233.530731739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.827591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.889687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.892453 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.392425921 +0000 UTC m=+233.634305118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:43 crc kubenswrapper[4792]: I0301 09:11:43.991312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:43 crc kubenswrapper[4792]: E0301 09:11:43.991704 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.491688615 +0000 UTC m=+233.733567812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.064920 4792 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.079788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.093034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.093383 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.593371508 +0000 UTC m=+233.835250705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.117491 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.194664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.195013 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.69499877 +0000 UTC m=+233.936877967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.236418 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.288386 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.294094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.296497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.296801 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.796788296 +0000 UTC m=+234.038667493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.397658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.397979 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.897950987 +0000 UTC m=+234.139830184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.398226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.398565 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:44.898553701 +0000 UTC m=+234.140432898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.506340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.506731 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.006713964 +0000 UTC m=+234.248593161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.606187 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:44 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.606280 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.607420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.607808 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-01 09:11:45.107758062 +0000 UTC m=+234.349637259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.708556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.708700 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.208668076 +0000 UTC m=+234.450547273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.709009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.709372 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.209363993 +0000 UTC m=+234.451243190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.780218 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerID="db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.780293 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerDied","Data":"db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786098 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.786184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerStarted","Data":"339bd57c1cc13f4119c941d59da1a1ea961d6edda5147620d400c9a04f2e8ceb"} Mar 01 09:11:44 crc kubenswrapper[4792]: 
I0301 09:11:44.788644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerStarted","Data":"671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.788674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerStarted","Data":"9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.793660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerStarted","Data":"5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.793707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerStarted","Data":"834122f1bb558979716c9151687e893e13e6120e36e39817e6beb2432dd0f0fd"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.794022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.800532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" event={"ID":"9b2af767-57b1-4774-9668-6610e9ac1bb9","Type":"ContainerStarted","Data":"ad6fbd3af39fd89c1a3dcc34e6e3ab5110eacb45dc7bd773cb3231d5c336f69a"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.801557 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802798 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.802884 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerStarted","Data":"477ff8111ce50f202ac20e7801f2ac3bd82c5ed546b9e6aeee69367ec0d09908"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806673 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.806760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerStarted","Data":"f1bc5d15012443ccc9990bce32da439c2d26cd8bac7e62fa7b81593bfe925710"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808722 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730" exitCode=0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.808835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"90fafa3ed5c527a4e521d0e12598d211c5be0e011b432479d9a243fe086b9d81"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.809450 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.809562 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.309537229 +0000 UTC m=+234.551416426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.809693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:44 crc kubenswrapper[4792]: E0301 09:11:44.810010 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-01 09:11:45.30999858 +0000 UTC m=+234.551877777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4jwnr" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.816842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4e25f294f98231b81432bed1d2a4d2d921bf0ecad4575f5829c0b9e753030adb"} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.823080 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.823917 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.824491 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.827100 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-977d4" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.827213 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.833375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.839119 4792 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6lk5b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]log ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]etcd ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/generic-apiserver-start-informers ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/max-in-flight-filter ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 01 09:11:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectcache ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-startinformers ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 01 09:11:44 crc kubenswrapper[4792]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 01 09:11:44 crc kubenswrapper[4792]: livez check failed Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 
09:11:44.839182 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" podUID="499393fc-abcf-4998-9e32-3d43a0b1e488" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.847358 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-64dsw" podStartSLOduration=13.84733928 podStartE2EDuration="13.84733928s" podCreationTimestamp="2026-03-01 09:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.844842348 +0000 UTC m=+234.086721555" watchObservedRunningTime="2026-03-01 09:11:44.84733928 +0000 UTC m=+234.089218477" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.852454 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.860341 4792 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-01T09:11:44.064951459Z","Handler":null,"Name":""} Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.869671 4792 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.869703 4792 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.901928 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=1.9018922630000001 podStartE2EDuration="1.901892263s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.861461837 +0000 UTC m=+234.103341034" watchObservedRunningTime="2026-03-01 09:11:44.901892263 +0000 UTC m=+234.143771470" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.910435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911331 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.911899 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " 
pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.961335 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.964050 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podStartSLOduration=5.964032602 podStartE2EDuration="5.964032602s" podCreationTimestamp="2026-03-01 09:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:44.96230743 +0000 UTC m=+234.204186627" watchObservedRunningTime="2026-03-01 09:11:44.964032602 +0000 UTC m=+234.205911799" Mar 01 09:11:44 crc kubenswrapper[4792]: I0301 09:11:44.979855 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.013952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.014559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"redhat-marketplace-n28r8\" (UID: 
\"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.015692 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.048268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"redhat-marketplace-n28r8\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") " pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.146120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.218892 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.219871 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.224436 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.224532 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.251397 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320465 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.320571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " 
pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.410541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4jwnr\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.422647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.423107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"redhat-marketplace-zwt8t\" (UID: 
\"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.423322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.454539 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.463754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"redhat-marketplace-zwt8t\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.542072 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.553957 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.607299 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:45 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:45 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:45 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.607352 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.654928 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:11:45 crc kubenswrapper[4792]: W0301 09:11:45.683043 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03dee1d_d7ca_422c_8af6_faa0c1af3863.slice/crio-385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a WatchSource:0}: Error finding container 385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a: Status 404 returned error can't find the container with id 385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.837952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.839163 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.848217 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.855606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a"} Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.859150 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.866792 4792 generic.go:334] "Generic (PLEG): container finished" podID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerID="671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37" exitCode=0 Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.866929 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerDied","Data":"671fb79672e1adda67b9686140cc8c8e23feef6fc30c770412920df0821a0f37"} Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.933723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.933805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdkr\" (UniqueName: 
\"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.934103 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:45 crc kubenswrapper[4792]: I0301 09:11:45.980536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037471 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.037557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc 
kubenswrapper[4792]: I0301 09:11:46.037927 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.038135 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.092155 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"redhat-operators-7nwc2\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") " pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.110798 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.113475 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.114270 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.123897 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.124346 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 01 09:11:46 crc kubenswrapper[4792]: W0301 09:11:46.136952 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf147eb3a_0f65_4ecb_b1a2_5d561c21253c.slice/crio-21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f WatchSource:0}: Error finding container 21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f: Status 404 returned error can't find the container with id 21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.188312 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.190767 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.242954 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.243229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.275123 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.276281 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.322501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344667 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344721 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344741 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.344834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.403645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.447626 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.448068 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.448298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.463058 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.500884 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.536515 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"redhat-operators-ftk4v\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.618162 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:46 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:46 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:46 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.618209 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.622242 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650288 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.650675 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") pod \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\" (UID: \"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3\") " Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.651733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.674377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q" (OuterVolumeSpecName: "kube-api-access-lzw4q") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "kube-api-access-lzw4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.677763 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" (UID: "6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752548 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzw4q\" (UniqueName: \"kubernetes.io/projected/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-kube-api-access-lzw4q\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752583 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.752592 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889784 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" event={"ID":"6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3","Type":"ContainerDied","Data":"e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889819 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e332fd58bac36920eb992b11914f683b07598dd8216c452a72973706723d6019" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.889890 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.936145 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.961795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerStarted","Data":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.962178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerStarted","Data":"21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.962215 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.995837 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4" exitCode=0 Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.996049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4"} Mar 01 09:11:46 crc kubenswrapper[4792]: I0301 09:11:46.996081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" 
event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerStarted","Data":"6a4a6d1ca04b5a791e8cc232ac1bcb86844593ceb569a67c479055eb35e8caec"} Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.000797 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39" exitCode=0 Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.002445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39"} Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.018609 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" podStartSLOduration=175.018591181 podStartE2EDuration="2m55.018591181s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:11:47.017280019 +0000 UTC m=+236.259159246" watchObservedRunningTime="2026-03-01 09:11:47.018591181 +0000 UTC m=+236.260470378" Mar 01 09:11:47 crc kubenswrapper[4792]: W0301 09:11:47.034642 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c7c368_3523_4224_aebd_59b29640bed0.slice/crio-52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560 WatchSource:0}: Error finding container 52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560: Status 404 returned error can't find the container with id 52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560 Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.141246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.468640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.615715 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:47 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:47 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:47 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.616027 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.724366 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.779679 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") pod \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.779735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") pod \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\" (UID: \"ab5a35f7-19e5-4496-b030-59dfa49a64cf\") " Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.780125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab5a35f7-19e5-4496-b030-59dfa49a64cf" (UID: "ab5a35f7-19e5-4496-b030-59dfa49a64cf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.788165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab5a35f7-19e5-4496-b030-59dfa49a64cf" (UID: "ab5a35f7-19e5-4496-b030-59dfa49a64cf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.882311 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:47 crc kubenswrapper[4792]: I0301 09:11:47.882375 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5a35f7-19e5-4496-b030-59dfa49a64cf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.020183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerStarted","Data":"8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.026844 4792 ???:1] "http: TLS handshake error from 192.168.126.11:60374: no serving certificate available for the kubelet" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033433 4792 generic.go:334] "Generic (PLEG): container finished" podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46" exitCode=0 Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.033762 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560"} Mar 
01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044851 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab5a35f7-19e5-4496-b030-59dfa49a64cf","Type":"ContainerDied","Data":"9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044923 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3b8533ecddfd3cef9d38d86a78b540d4a7fe462e6ed508f6f81c47012cffc7" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.044935 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.067673 4792 generic.go:334] "Generic (PLEG): container finished" podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17" exitCode=0 Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.068839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.068863 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"4d930733fc56c620ebbeeb1e0668d704b5df033f141d5439fd47f1e4757bacb0"} Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.606411 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:48 crc kubenswrapper[4792]: [-]has-synced 
failed: reason withheld Mar 01 09:11:48 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:48 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:48 crc kubenswrapper[4792]: I0301 09:11:48.606469 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.117615 4792 generic.go:334] "Generic (PLEG): container finished" podID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerID="c4d130b61a49d2b6adb269d65f6c6f670dd6878b17ef19970b09e3cdeea1a62b" exitCode=0 Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.117694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerDied","Data":"c4d130b61a49d2b6adb269d65f6c6f670dd6878b17ef19970b09e3cdeea1a62b"} Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.607759 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:49 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:49 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:49 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.607831 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.829087 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.833801 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6lk5b" Mar 01 09:11:49 crc kubenswrapper[4792]: I0301 09:11:49.989194 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rjwhk" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.606395 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:50 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:50 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:50 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.606461 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.667658 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740659 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") pod \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740704 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") pod \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\" (UID: \"9a193ef8-bf31-4d61-972e-5772b8fe8c39\") " Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.740982 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9a193ef8-bf31-4d61-972e-5772b8fe8c39" (UID: "9a193ef8-bf31-4d61-972e-5772b8fe8c39"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.757120 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9a193ef8-bf31-4d61-972e-5772b8fe8c39" (UID: "9a193ef8-bf31-4d61-972e-5772b8fe8c39"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.843206 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:50 crc kubenswrapper[4792]: I0301 09:11:50.843251 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a193ef8-bf31-4d61-972e-5772b8fe8c39-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9a193ef8-bf31-4d61-972e-5772b8fe8c39","Type":"ContainerDied","Data":"8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116"} Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212418 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c3c0f18faf3dfd4f659a530a5e1444bf2456ab64ec87d38ae446eb8262bf116" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.212170 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.605582 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:51 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:51 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:51 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:51 crc kubenswrapper[4792]: I0301 09:11:51.605635 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.546622 4792 ???:1] "http: TLS handshake error from 192.168.126.11:60390: no serving certificate available for the kubelet" Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.607790 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:52 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:52 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:52 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:52 crc kubenswrapper[4792]: I0301 09:11:52.607849 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187171 4792 
patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187462 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187182 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-wxl8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.187510 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wxl8v" podUID="6cc55bdf-6c0f-4d35-879f-c64c2dc4897c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.205928 4792 patch_prober.go:28] interesting pod/console-f9d7485db-zrzcg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.205970 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection 
refused" Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.606173 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:53 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:53 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:53 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:53 crc kubenswrapper[4792]: I0301 09:11:53.606229 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:54 crc kubenswrapper[4792]: I0301 09:11:54.628361 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:54 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:54 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:54 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:54 crc kubenswrapper[4792]: I0301 09:11:54.628476 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:55 crc kubenswrapper[4792]: I0301 09:11:55.605296 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:55 
crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 01 09:11:55 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:55 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:55 crc kubenswrapper[4792]: I0301 09:11:55.605675 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.606064 4792 patch_prober.go:28] interesting pod/router-default-5444994796-qtg4x container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 01 09:11:56 crc kubenswrapper[4792]: [+]has-synced ok Mar 01 09:11:56 crc kubenswrapper[4792]: [+]process-running ok Mar 01 09:11:56 crc kubenswrapper[4792]: healthz check failed Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.606117 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qtg4x" podUID="6d0571e3-5089-4157-a36a-25ecfe6a67f2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.727671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.729235 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.751875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa0bf523-6582-46b4-9134-28880a50b474-metrics-certs\") pod \"network-metrics-daemon-frm7z\" (UID: \"fa0bf523-6582-46b4-9134-28880a50b474\") " pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.852630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:11:56 crc kubenswrapper[4792]: I0301 09:11:56.861803 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-frm7z" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.605777 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.615263 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qtg4x" Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.840747 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.841024 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" containerID="cri-o://5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7" gracePeriod=30 Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.860784 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:11:57 crc kubenswrapper[4792]: I0301 09:11:57.861097 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" containerID="cri-o://3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" gracePeriod=30 Mar 01 09:11:58 crc kubenswrapper[4792]: I0301 09:11:58.302806 4792 ???:1] "http: TLS handshake error from 192.168.126.11:50104: no serving certificate available for the kubelet" Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.302675 4792 generic.go:334] "Generic (PLEG): container finished" podID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerID="3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" exitCode=0 Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.302710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerDied","Data":"3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8"} Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.305119 4792 generic.go:334] "Generic (PLEG): container finished" podID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerID="5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7" exitCode=0 Mar 01 09:11:59 crc kubenswrapper[4792]: I0301 09:11:59.305152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerDied","Data":"5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7"} Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132121 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132794 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" 
containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132814 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132834 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: E0301 09:12:00.132851 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132858 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.132988 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a193ef8-bf31-4d61-972e-5772b8fe8c39" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133002 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5a35f7-19e5-4496-b030-59dfa49a64cf" containerName="pruner" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133011 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" containerName="collect-profiles" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.133382 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.136510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.138730 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.305231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.411895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.450470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"auto-csr-approver-29539272-nq8dk\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:00 crc kubenswrapper[4792]: I0301 09:12:00.455010 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:01 crc kubenswrapper[4792]: I0301 09:12:01.204575 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:01 crc kubenswrapper[4792]: I0301 09:12:01.204630 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.192018 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wxl8v" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.214653 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.222302 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.764759 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 01 09:12:03 crc kubenswrapper[4792]: I0301 09:12:03.764821 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" 
podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 01 09:12:04 crc kubenswrapper[4792]: I0301 09:12:04.943602 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:12:04 crc kubenswrapper[4792]: I0301 09:12:04.943654 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:12:05 crc kubenswrapper[4792]: I0301 09:12:05.561859 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:12:11 crc kubenswrapper[4792]: I0301 09:12:11.204135 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:11 crc kubenswrapper[4792]: I0301 09:12:11.204183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:13 crc kubenswrapper[4792]: I0301 
09:12:13.764445 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 01 09:12:13 crc kubenswrapper[4792]: I0301 09:12:13.764835 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 01 09:12:14 crc kubenswrapper[4792]: I0301 09:12:14.903943 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nc4gs" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.694933 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.695848 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.697429 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.702640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.710091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.763714 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.763802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.864675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:19 crc kubenswrapper[4792]: I0301 09:12:19.887735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:20 crc kubenswrapper[4792]: I0301 09:12:20.071230 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.203838 4792 patch_prober.go:28] interesting pod/route-controller-manager-8ddf4cb6d-chpwp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.204228 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 01 09:12:21 crc kubenswrapper[4792]: I0301 09:12:21.882994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.495069 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.495998 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.507557 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.623845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.725052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.725837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726171 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726067 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.726345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.744328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.764665 4792 patch_prober.go:28] interesting pod/controller-manager-5bf56554b8-qv8hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.766076 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:12:24 crc kubenswrapper[4792]: I0301 09:12:24.847207 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:12:26 crc kubenswrapper[4792]: I0301 09:12:26.981511 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.017119 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:27 crc kubenswrapper[4792]: E0301 09:12:27.017413 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.018055 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.018479 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" containerName="route-controller-manager" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.022629 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.026826 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.029073 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060049 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060138 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.060258 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") pod \"7e3e715b-f024-4842-94ed-1f1e054e89c6\" (UID: \"7e3e715b-f024-4842-94ed-1f1e054e89c6\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.061384 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config" (OuterVolumeSpecName: "config") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.062322 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.069737 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk" (OuterVolumeSpecName: "kube-api-access-pwpwk") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "kube-api-access-pwpwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.094613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e3e715b-f024-4842-94ed-1f1e054e89c6" (UID: "7e3e715b-f024-4842-94ed-1f1e054e89c6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161647 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161788 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.161842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") pod \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\" (UID: \"2fca3ad3-b093-4857-85cb-3db2b6516dcf\") " Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4dk\" 
(UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162139 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162184 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162247 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162290 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3e715b-f024-4842-94ed-1f1e054e89c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162304 4792 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162321 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwpwk\" (UniqueName: \"kubernetes.io/projected/7e3e715b-f024-4842-94ed-1f1e054e89c6-kube-api-access-pwpwk\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162363 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3e715b-f024-4842-94ed-1f1e054e89c6-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.162896 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config" (OuterVolumeSpecName: "config") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.165064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.165134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l" (OuterVolumeSpecName: "kube-api-access-4pp7l") pod "2fca3ad3-b093-4857-85cb-3db2b6516dcf" (UID: "2fca3ad3-b093-4857-85cb-3db2b6516dcf"). InnerVolumeSpecName "kube-api-access-4pp7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263761 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pp7l\" (UniqueName: \"kubernetes.io/projected/2fca3ad3-b093-4857-85cb-3db2b6516dcf-kube-api-access-4pp7l\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263773 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263782 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263793 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fca3ad3-b093-4857-85cb-3db2b6516dcf-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.263801 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fca3ad3-b093-4857-85cb-3db2b6516dcf-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.264791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.265078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.266882 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.279311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"route-controller-manager-6947bdc57d-4m2vn\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.341119 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" event={"ID":"7e3e715b-f024-4842-94ed-1f1e054e89c6","Type":"ContainerDied","Data":"92e007ff2e490cd943aeebddedb6f93d838bbb65607bec7ae9a1be207a92d259"} Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469629 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.469650 4792 scope.go:117] "RemoveContainer" containerID="3e137621be8fd281e91660714989b3524d34fe04d0d9acd500e98e9bc50b05d8" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.473550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" event={"ID":"2fca3ad3-b093-4857-85cb-3db2b6516dcf","Type":"ContainerDied","Data":"834122f1bb558979716c9151687e893e13e6120e36e39817e6beb2432dd0f0fd"} Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.473608 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf56554b8-qv8hr" Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.490870 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.495095 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8ddf4cb6d-chpwp"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.498940 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:12:27 crc kubenswrapper[4792]: I0301 09:12:27.502040 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bf56554b8-qv8hr"] Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.172172 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.172339 4792 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 01 09:12:28 crc kubenswrapper[4792]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 01 09:12:28 crc kubenswrapper[4792]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ql5bq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29539270-q7hck_openshift-infra(b4130507-2de2-48c2-9c3f-e9474aeca556): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 01 09:12:28 crc kubenswrapper[4792]: > logger="UnhandledError" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.173866 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29539270-q7hck" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" Mar 01 09:12:28 crc kubenswrapper[4792]: E0301 09:12:28.481353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29539270-q7hck" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.416028 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" 
path="/var/lib/kubelet/pods/2fca3ad3-b093-4857-85cb-3db2b6516dcf/volumes" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.416991 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3e715b-f024-4842-94ed-1f1e054e89c6" path="/var/lib/kubelet/pods/7e3e715b-f024-4842-94ed-1f1e054e89c6/volumes" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:29 crc kubenswrapper[4792]: E0301 09:12:29.487279 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487295 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487393 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fca3ad3-b093-4857-85cb-3db2b6516dcf" containerName="controller-manager" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.487808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.490003 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.490161 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.491282 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.491514 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492177 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492387 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.492480 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.499475 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " 
pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.591876 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.592087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.592186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.693869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.693973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.694946 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.695368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.695741 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.699631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.720452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"controller-manager-848d8759d6-kmmxk\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 
09:12:29 crc kubenswrapper[4792]: I0301 09:12:29.811153 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:29 crc kubenswrapper[4792]: E0301 09:12:29.953301 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:29.954159 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjlvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]Env
FromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zwt8t_openshift-marketplace(fee8fc8f-8d72-4606-b115-4197f599cfcb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:29.955396 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.133604 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.133760 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kc5gl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-n28r8_openshift-marketplace(e03dee1d-d7ca-422c-8af6-faa0c1af3863): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:30 crc kubenswrapper[4792]: E0301 09:12:30.134928 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" Mar 01 09:12:33 crc 
kubenswrapper[4792]: E0301 09:12:33.628378 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.628382 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.704230 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.704603 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tbnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ftk4v_openshift-marketplace(e93e87c0-86c5-446d-9f43-71d17960a351): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:33 crc kubenswrapper[4792]: E0301 09:12:33.705858 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" Mar 01 09:12:34 crc 
kubenswrapper[4792]: I0301 09:12:34.942894 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:12:34 crc kubenswrapper[4792]: I0301 09:12:34.942968 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:12:36 crc kubenswrapper[4792]: E0301 09:12:36.494144 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.706811 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.706968 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn884,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-l7ngd_openshift-marketplace(8333a325-229b-4dfd-a1f8-966f39bf55fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:37 crc kubenswrapper[4792]: E0301 09:12:37.708267 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" Mar 01 09:12:39 crc 
kubenswrapper[4792]: I0301 09:12:39.291172 4792 ???:1] "http: TLS handshake error from 192.168.126.11:47816: no serving certificate available for the kubelet" Mar 01 09:12:40 crc kubenswrapper[4792]: E0301 09:12:40.009697 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.047886 4792 scope.go:117] "RemoveContainer" containerID="5e459860652998dbc975fd66e05ff3eab25257081065483e039f02fb979238f7" Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.397018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.503538 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.523989 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-frm7z"] Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.541107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"d68ef00952b1357bb1e49103f75298e3f4ea20890953a998565a8e4d1e6a7bca"} Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.548049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerStarted","Data":"ec7c242400c3788f70c2a91c8d7378bde577b0383ee21a7d115406b72e0bfbe5"} Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 
09:12:40.549312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerStarted","Data":"41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6"} Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.609606 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 01 09:12:40 crc kubenswrapper[4792]: W0301 09:12:40.615112 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4bc9fd32_61ce_4fbb_b67e_1376102f5384.slice/crio-e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10 WatchSource:0}: Error finding container e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10: Status 404 returned error can't find the container with id e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10 Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.620796 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 01 09:12:40 crc kubenswrapper[4792]: I0301 09:12:40.683374 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:40 crc kubenswrapper[4792]: W0301 09:12:40.689595 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1761bae5_8e03_478f_938c_df41041a062c.slice/crio-7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a WatchSource:0}: Error finding container 7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a: Status 404 returned error can't find the container with id 7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.556644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerStarted","Data":"e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10"} Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.559498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerStarted","Data":"f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317"} Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.559548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerStarted","Data":"6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8"} Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.560797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerStarted","Data":"7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a"} Mar 01 09:12:41 crc kubenswrapper[4792]: I0301 09:12:41.562044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerStarted","Data":"6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d"} Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.238463 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.238827 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whj6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-p6b9w_openshift-marketplace(6fd91972-6bfc-4041-abc2-8f4298584603): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.240187 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.568201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"41bb3bce4ab3fe37dbba668e29eee509bd06b39592fb65b56c393ef53fd7337f"} Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.571374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerStarted","Data":"7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f"} Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.573162 4792 generic.go:334] "Generic (PLEG): container finished" podID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerID="f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317" exitCode=0 Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.573219 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerDied","Data":"f382784376f22fb9daf42b2e14133ec72dd7414c987c151edf216564d1c7d317"} Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.574970 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerStarted","Data":"c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291"} Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.575403 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:42 crc kubenswrapper[4792]: 
E0301 09:12:42.586628 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.592069 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.592054354 podStartE2EDuration="18.592054354s" podCreationTimestamp="2026-03-01 09:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.588864194 +0000 UTC m=+291.830743391" watchObservedRunningTime="2026-03-01 09:12:42.592054354 +0000 UTC m=+291.833933541" Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.611078 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.690839 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" podStartSLOduration=25.690822812 podStartE2EDuration="25.690822812s" podCreationTimestamp="2026-03-01 09:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.688997146 +0000 UTC m=+291.930876343" watchObservedRunningTime="2026-03-01 09:12:42.690822812 +0000 UTC m=+291.932702009" Mar 01 09:12:42 crc kubenswrapper[4792]: I0301 09:12:42.725403 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podStartSLOduration=25.725386745 
podStartE2EDuration="25.725386745s" podCreationTimestamp="2026-03-01 09:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:42.722993755 +0000 UTC m=+291.964872952" watchObservedRunningTime="2026-03-01 09:12:42.725386745 +0000 UTC m=+291.967265942" Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.977597 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.977740 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nc28k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wxb87_openshift-marketplace(9073e3da-2d6f-48a3-907a-e347f28559ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:42 crc kubenswrapper[4792]: E0301 09:12:42.980056 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" Mar 01 09:12:43 crc 
kubenswrapper[4792]: I0301 09:12:43.259708 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.580496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-frm7z" event={"ID":"fa0bf523-6582-46b4-9134-28880a50b474","Type":"ContainerStarted","Data":"842d74ac38d89984faf44a260f0f962bba2e310a1347d674617143778b1ee622"} Mar 01 09:12:43 crc kubenswrapper[4792]: E0301 09:12:43.581836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.602394 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-frm7z" podStartSLOduration=231.602365598 podStartE2EDuration="3m51.602365598s" podCreationTimestamp="2026-03-01 09:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:12:43.599111926 +0000 UTC m=+292.840991133" watchObservedRunningTime="2026-03-01 09:12:43.602365598 +0000 UTC m=+292.844244795" Mar 01 09:12:43 crc kubenswrapper[4792]: I0301 09:12:43.833185 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.020759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") pod \"15c3e52b-97f9-45e8-a7ba-c360739547e7\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.020835 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") pod \"15c3e52b-97f9-45e8-a7ba-c360739547e7\" (UID: \"15c3e52b-97f9-45e8-a7ba-c360739547e7\") " Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.021213 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15c3e52b-97f9-45e8-a7ba-c360739547e7" (UID: "15c3e52b-97f9-45e8-a7ba-c360739547e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.025848 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15c3e52b-97f9-45e8-a7ba-c360739547e7" (UID: "15c3e52b-97f9-45e8-a7ba-c360739547e7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.121839 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15c3e52b-97f9-45e8-a7ba-c360739547e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.121882 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15c3e52b-97f9-45e8-a7ba-c360739547e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.589973 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.591991 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15c3e52b-97f9-45e8-a7ba-c360739547e7","Type":"ContainerDied","Data":"6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8"} Mar 01 09:12:44 crc kubenswrapper[4792]: I0301 09:12:44.592027 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc5a90f68bbb98f1b1144b481316b3233e7eb33cd6b34c20b580259bfcc2ec8" Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.643521 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.643947 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thnsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cw675_openshift-marketplace(dff0d675-52dd-4cac-a7be-8750333c28e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.645192 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" Mar 01 09:12:44 crc 
kubenswrapper[4792]: E0301 09:12:44.848016 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.848204 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-7nwc2_openshift-marketplace(22c7c368-3523-4224-aebd-59b29640bed0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 01 09:12:44 crc kubenswrapper[4792]: E0301 09:12:44.849414 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.416718 4792 csr.go:261] certificate signing request csr-tfd7m is approved, waiting to be issued Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.422590 4792 csr.go:257] certificate signing request csr-tfd7m is issued Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.595991 4792 generic.go:334] "Generic (PLEG): container finished" podID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerID="f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c" exitCode=0 Mar 01 09:12:45 crc kubenswrapper[4792]: I0301 09:12:45.596065 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerDied","Data":"f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c"} Mar 01 09:12:45 crc kubenswrapper[4792]: E0301 09:12:45.598164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" Mar 01 09:12:45 crc kubenswrapper[4792]: E0301 09:12:45.598182 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.423995 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 07:52:47.639347513 +0000 UTC Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.424028 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7030h40m1.215321809s for next certificate rotation Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.601521 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerID="5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee" exitCode=0 Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.601615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerDied","Data":"5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee"} Mar 01 09:12:46 crc kubenswrapper[4792]: I0301 09:12:46.890542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.059839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") pod \"b4130507-2de2-48c2-9c3f-e9474aeca556\" (UID: \"b4130507-2de2-48c2-9c3f-e9474aeca556\") " Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.065367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq" (OuterVolumeSpecName: "kube-api-access-ql5bq") pod "b4130507-2de2-48c2-9c3f-e9474aeca556" (UID: "b4130507-2de2-48c2-9c3f-e9474aeca556"). InnerVolumeSpecName "kube-api-access-ql5bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.161717 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5bq\" (UniqueName: \"kubernetes.io/projected/b4130507-2de2-48c2-9c3f-e9474aeca556-kube-api-access-ql5bq\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.342016 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.346870 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.431110 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-25 19:58:31.172266767 +0000 UTC Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.431138 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: 
Waiting 7186h45m43.741130849s for next certificate rotation Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.607998 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539270-q7hck" event={"ID":"b4130507-2de2-48c2-9c3f-e9474aeca556","Type":"ContainerDied","Data":"ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448"} Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.608034 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff027c882d14f4f304f09047fac83f66f0a08cd7a1d80eea6711ce8cef3e6448" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.608118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539270-q7hck" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.884358 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.972366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") pod \"8d574e82-f840-4f0c-982d-f6a133bd64ae\" (UID: \"8d574e82-f840-4f0c-982d-f6a133bd64ae\") " Mar 01 09:12:47 crc kubenswrapper[4792]: I0301 09:12:47.975830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h" (OuterVolumeSpecName: "kube-api-access-hjz8h") pod "8d574e82-f840-4f0c-982d-f6a133bd64ae" (UID: "8d574e82-f840-4f0c-982d-f6a133bd64ae"). InnerVolumeSpecName "kube-api-access-hjz8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.074469 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjz8h\" (UniqueName: \"kubernetes.io/projected/8d574e82-f840-4f0c-982d-f6a133bd64ae-kube-api-access-hjz8h\") on node \"crc\" DevicePath \"\"" Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.615245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56"} Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616521 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" event={"ID":"8d574e82-f840-4f0c-982d-f6a133bd64ae","Type":"ContainerDied","Data":"41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6"} Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616599 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f9d0eeab6a97d7a4669fab8bc8a58ebb5b5c501c74bede3e0472eab0fbf2e6" Mar 01 09:12:48 crc kubenswrapper[4792]: I0301 09:12:48.616612 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539272-nq8dk" Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.621606 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56" exitCode=0 Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.622658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56"} Mar 01 09:12:49 crc kubenswrapper[4792]: I0301 09:12:49.625006 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72"} Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.636294 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c" exitCode=0 Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.636331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c"} Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.638132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerStarted","Data":"1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f"} Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.642894 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72" exitCode=0 Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.642950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72"} Mar 01 09:12:50 crc kubenswrapper[4792]: I0301 09:12:50.672898 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n28r8" podStartSLOduration=3.532604609 podStartE2EDuration="1m6.67288009s" podCreationTimestamp="2026-03-01 09:11:44 +0000 UTC" firstStartedPulling="2026-03-01 09:11:47.034734229 +0000 UTC m=+236.276613426" lastFinishedPulling="2026-03-01 09:12:50.17500971 +0000 UTC m=+299.416888907" observedRunningTime="2026-03-01 09:12:50.671391943 +0000 UTC m=+299.913271150" watchObservedRunningTime="2026-03-01 09:12:50.67288009 +0000 UTC m=+299.914759287" Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.664263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerStarted","Data":"f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6"} Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.667263 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerStarted","Data":"66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3"} Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.681793 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zwt8t" podStartSLOduration=2.398941363 podStartE2EDuration="1m6.68177938s" 
podCreationTimestamp="2026-03-01 09:11:45 +0000 UTC" firstStartedPulling="2026-03-01 09:11:46.998521037 +0000 UTC m=+236.240400234" lastFinishedPulling="2026-03-01 09:12:51.281359054 +0000 UTC m=+300.523238251" observedRunningTime="2026-03-01 09:12:51.681698488 +0000 UTC m=+300.923577675" watchObservedRunningTime="2026-03-01 09:12:51.68177938 +0000 UTC m=+300.923658577" Mar 01 09:12:51 crc kubenswrapper[4792]: I0301 09:12:51.702836 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ftk4v" podStartSLOduration=2.6093556209999997 podStartE2EDuration="1m5.702820915s" podCreationTimestamp="2026-03-01 09:11:46 +0000 UTC" firstStartedPulling="2026-03-01 09:11:48.072900337 +0000 UTC m=+237.314779534" lastFinishedPulling="2026-03-01 09:12:51.166365631 +0000 UTC m=+300.408244828" observedRunningTime="2026-03-01 09:12:51.701204775 +0000 UTC m=+300.943083972" watchObservedRunningTime="2026-03-01 09:12:51.702820915 +0000 UTC m=+300.944700112" Mar 01 09:12:52 crc kubenswrapper[4792]: I0301 09:12:52.678599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3"} Mar 01 09:12:53 crc kubenswrapper[4792]: I0301 09:12:53.684618 4792 generic.go:334] "Generic (PLEG): container finished" podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3" exitCode=0 Mar 01 09:12:53 crc kubenswrapper[4792]: I0301 09:12:53.684663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3"} Mar 01 09:12:54 crc kubenswrapper[4792]: I0301 09:12:54.690732 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerStarted","Data":"eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65"} Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.147224 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.147311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.430790 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l7ngd" podStartSLOduration=2.768727987 podStartE2EDuration="1m12.430774877s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.809652582 +0000 UTC m=+234.051531779" lastFinishedPulling="2026-03-01 09:12:54.471699472 +0000 UTC m=+303.713578669" observedRunningTime="2026-03-01 09:12:54.71137109 +0000 UTC m=+303.953250287" watchObservedRunningTime="2026-03-01 09:12:55.430774877 +0000 UTC m=+304.672654074" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.517258 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.543531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.543575 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.587754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:12:55 crc kubenswrapper[4792]: I0301 09:12:55.734492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n28r8" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.624042 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.625498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.703565 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801" exitCode=0 Mar 01 09:12:56 crc kubenswrapper[4792]: I0301 09:12:56.704598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.665290 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" probeResult="failure" output=< Mar 01 09:12:57 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:12:57 crc kubenswrapper[4792]: > Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.709932 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6" exitCode=0 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.710009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.711799 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e" exitCode=0 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.711830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.716134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerStarted","Data":"9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32"} Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.785874 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wxb87" podStartSLOduration=3.445852139 podStartE2EDuration="1m15.785850213s" podCreationTimestamp="2026-03-01 09:11:42 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.807849498 +0000 UTC m=+234.049728695" lastFinishedPulling="2026-03-01 09:12:57.147847572 +0000 UTC m=+306.389726769" observedRunningTime="2026-03-01 09:12:57.780737045 +0000 UTC m=+307.022616252" watchObservedRunningTime="2026-03-01 09:12:57.785850213 +0000 UTC m=+307.027729420" Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.866796 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.867047 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" containerID="cri-o://c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" gracePeriod=30 Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.976881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:12:57 crc kubenswrapper[4792]: I0301 09:12:57.977122 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" containerID="cri-o://6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" gracePeriod=30 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.737312 4792 generic.go:334] "Generic (PLEG): container finished" podID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerID="6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" exitCode=0 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.737402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerDied","Data":"6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d"} Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.740978 4792 generic.go:334] "Generic (PLEG): container finished" podID="1761bae5-8e03-478f-938c-df41041a062c" containerID="c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" exitCode=0 Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.741041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" 
event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerDied","Data":"c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291"} Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.877456 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910130 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910362 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910376 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910391 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910398 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910412 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910420 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: E0301 09:12:59.910437 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910445 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910558 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910574 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" containerName="route-controller-manager" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" containerName="oc" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.910600 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c3e52b-97f9-45e8-a7ba-c360739547e7" containerName="pruner" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.912406 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.926997 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.929880 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944537 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.944649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") pod \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\" (UID: \"810605ea-bf2f-4cd2-87a9-a09e9d5e7110\") " Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.945187 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca" (OuterVolumeSpecName: "client-ca") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" 
(UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.945439 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config" (OuterVolumeSpecName: "config") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.951041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:12:59 crc kubenswrapper[4792]: I0301 09:12:59.966130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk" (OuterVolumeSpecName: "kube-api-access-pz4dk") pod "810605ea-bf2f-4cd2-87a9-a09e9d5e7110" (UID: "810605ea-bf2f-4cd2-87a9-a09e9d5e7110"). InnerVolumeSpecName "kube-api-access-pz4dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046279 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046384 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046456 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") pod \"1761bae5-8e03-478f-938c-df41041a062c\" (UID: \"1761bae5-8e03-478f-938c-df41041a062c\") " Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046802 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046818 4792 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-pz4dk\" (UniqueName: \"kubernetes.io/projected/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-kube-api-access-pz4dk\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046830 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.046841 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810605ea-bf2f-4cd2-87a9-a09e9d5e7110-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047715 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca" (OuterVolumeSpecName: "client-ca") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.047859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config" (OuterVolumeSpecName: "config") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.049653 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v" (OuterVolumeSpecName: "kube-api-access-l558v") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "kube-api-access-l558v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.049808 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1761bae5-8e03-478f-938c-df41041a062c" (UID: "1761bae5-8e03-478f-938c-df41041a062c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147758 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxj8\" (UniqueName: 
\"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147977 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.147999 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l558v\" (UniqueName: \"kubernetes.io/projected/1761bae5-8e03-478f-938c-df41041a062c-kube-api-access-l558v\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148013 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148023 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1761bae5-8e03-478f-938c-df41041a062c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.148033 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1761bae5-8e03-478f-938c-df41041a062c-config\") on node 
\"crc\" DevicePath \"\"" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.149284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.151206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.156917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.162930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"route-controller-manager-79b74d7999-nwdjk\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.241473 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.444876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:00 crc kubenswrapper[4792]: W0301 09:13:00.451299 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32059d6_24aa_4b5a_afb2_b2cf75704d53.slice/crio-8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05 WatchSource:0}: Error finding container 8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05: Status 404 returned error can't find the container with id 8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05 Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.754999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerStarted","Data":"8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" event={"ID":"810605ea-bf2f-4cd2-87a9-a09e9d5e7110","Type":"ContainerDied","Data":"ec7c242400c3788f70c2a91c8d7378bde577b0383ee21a7d115406b72e0bfbe5"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756281 4792 scope.go:117] "RemoveContainer" containerID="6c2c0cfa98afb0d975335d6142781b6265b7d1ccb65a4ca10172168101d7316d" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.756373 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.759328 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" event={"ID":"1761bae5-8e03-478f-938c-df41041a062c","Type":"ContainerDied","Data":"7470d72dfdecb7555879e3e75bbc32cdde09ed41a6f66ab3ea84fdfcb9d1191a"} Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.759518 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.786379 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.789985 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6947bdc57d-4m2vn"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.799045 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.801566 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-848d8759d6-kmmxk"] Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.812371 4792 patch_prober.go:28] interesting pod/controller-manager-848d8759d6-kmmxk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 01 09:13:00 crc kubenswrapper[4792]: I0301 09:13:00.812408 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-848d8759d6-kmmxk" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.416697 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1761bae5-8e03-478f-938c-df41041a062c" path="/var/lib/kubelet/pods/1761bae5-8e03-478f-938c-df41041a062c/volumes" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.418091 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810605ea-bf2f-4cd2-87a9-a09e9d5e7110" path="/var/lib/kubelet/pods/810605ea-bf2f-4cd2-87a9-a09e9d5e7110/volumes" Mar 01 09:13:01 crc kubenswrapper[4792]: I0301 09:13:01.484378 4792 scope.go:117] "RemoveContainer" containerID="c6c92a3162c61d2929d59a565ad6956d560ccb189eb5098f9d86e3c3e62e3291" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.458721 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:02 crc kubenswrapper[4792]: E0301 09:13:02.459318 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459334 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459456 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1761bae5-8e03-478f-938c-df41041a062c" containerName="controller-manager" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.459887 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462369 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462626 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462797 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.462881 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.464987 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.473057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.479759 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " 
pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.579509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.680828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7hf\" 
(UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.680975 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.681094 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.681947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.682485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.683701 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.684053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.686214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.687401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc kubenswrapper[4792]: I0301 09:13:02.699185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"controller-manager-774f47c464-sjvj2\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:02 crc 
kubenswrapper[4792]: I0301 09:13:02.779401 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.021585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.022169 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.074216 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.564256 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.564405 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.620643 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.819103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:03 crc kubenswrapper[4792]: I0301 09:13:03.830189 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wxb87" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942556 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942608 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.942647 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.943148 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:13:04 crc kubenswrapper[4792]: I0301 09:13:04.943199 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac" gracePeriod=600 Mar 01 09:13:05 crc kubenswrapper[4792]: I0301 09:13:05.605538 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.660225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.698891 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.794656 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac" exitCode=0 Mar 01 09:13:06 crc kubenswrapper[4792]: I0301 09:13:06.794993 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac"} Mar 01 09:13:07 crc kubenswrapper[4792]: I0301 09:13:07.864118 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:07 crc kubenswrapper[4792]: I0301 09:13:07.864488 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l7ngd" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" containerID="cri-o://eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" gracePeriod=2 Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.065212 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.065460 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zwt8t" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" containerID="cri-o://f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" gracePeriod=2 Mar 01 09:13:08 crc kubenswrapper[4792]: I0301 09:13:08.302222 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" 
containerName="oauth-openshift" containerID="cri-o://63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" gracePeriod=15 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.812830 4792 generic.go:334] "Generic (PLEG): container finished" podID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerID="f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.813220 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.815427 4792 generic.go:334] "Generic (PLEG): container finished" podID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerID="eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.815483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.817009 4792 generic.go:334] "Generic (PLEG): container finished" podID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerID="63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" exitCode=0 Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.817038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerDied","Data":"63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940"} Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.889357 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.973361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974500 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974578 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974599 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 
01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974633 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974680 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976171 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976228 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976244 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.974833 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.975308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976215 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976285 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976441 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") pod \"020a8218-62f4-4abf-a8d2-fed602de5f7f\" (UID: \"020a8218-62f4-4abf-a8d2-fed602de5f7f\") " Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.976754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977137 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977334 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977438 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977462 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977482 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.977500 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/020a8218-62f4-4abf-a8d2-fed602de5f7f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.980379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.981491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.982205 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.983885 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx" (OuterVolumeSpecName: "kube-api-access-55wpx") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "kube-api-access-55wpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.985495 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.986162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987457 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:09 crc kubenswrapper[4792]: I0301 09:13:09.987656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "020a8218-62f4-4abf-a8d2-fed602de5f7f" (UID: "020a8218-62f4-4abf-a8d2-fed602de5f7f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078409 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078445 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078458 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078471 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078486 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078500 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078514 4792 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-55wpx\" (UniqueName: \"kubernetes.io/projected/020a8218-62f4-4abf-a8d2-fed602de5f7f-kube-api-access-55wpx\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078527 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.078538 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/020a8218-62f4-4abf-a8d2-fed602de5f7f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.254801 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:10 crc kubenswrapper[4792]: W0301 09:13:10.283999 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb03c466_db7d_4cc2_9744_fb90116eac6f.slice/crio-2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525 WatchSource:0}: Error finding container 2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525: Status 404 returned error can't find the container with id 2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525 Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.315119 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.352585 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381590 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.381702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") pod \"fee8fc8f-8d72-4606-b115-4197f599cfcb\" (UID: \"fee8fc8f-8d72-4606-b115-4197f599cfcb\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.382860 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities" (OuterVolumeSpecName: "utilities") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.387278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw" (OuterVolumeSpecName: "kube-api-access-mjlvw") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "kube-api-access-mjlvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.408636 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fee8fc8f-8d72-4606-b115-4197f599cfcb" (UID: "fee8fc8f-8d72-4606-b115-4197f599cfcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.467620 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.467890 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ftk4v" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" containerID="cri-o://66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" gracePeriod=2 Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.481856 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482134 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482147 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482155 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" 
containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482166 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482193 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482203 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="extract-utilities" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482222 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482231 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="extract-content" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482243 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482251 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: E0301 09:13:10.482259 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482266 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" 
containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482364 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482379 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" containerName="oauth-openshift" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.482392 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" containerName="registry-server" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483092 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") pod \"8333a325-229b-4dfd-a1f8-966f39bf55fc\" (UID: \"8333a325-229b-4dfd-a1f8-966f39bf55fc\") " Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483413 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc 
kubenswrapper[4792]: I0301 09:13:10.483435 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjlvw\" (UniqueName: \"kubernetes.io/projected/fee8fc8f-8d72-4606-b115-4197f599cfcb-kube-api-access-mjlvw\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.483449 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fee8fc8f-8d72-4606-b115-4197f599cfcb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.484234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities" (OuterVolumeSpecName: "utilities") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.484432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.490597 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.490950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884" (OuterVolumeSpecName: "kube-api-access-jn884") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "kube-api-access-jn884". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.551738 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8333a325-229b-4dfd-a1f8-966f39bf55fc" (UID: "8333a325-229b-4dfd-a1f8-966f39bf55fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584804 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584886 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584966 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.584990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvs6\" (UniqueName: \"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585322 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: 
\"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585644 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn884\" (UniqueName: \"kubernetes.io/projected/8333a325-229b-4dfd-a1f8-966f39bf55fc-kube-api-access-jn884\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585662 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.585674 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8333a325-229b-4dfd-a1f8-966f39bf55fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686269 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 
crc kubenswrapper[4792]: I0301 09:13:10.686311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvs6\" (UniqueName: \"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.686466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688252 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-audit-policies\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/584dbcf3-9289-47c3-a556-0418b670cb21-audit-dir\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689010 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.688402 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.689842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.690922 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.691243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.691838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " 
pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692151 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692198 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.692504 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.693998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/584dbcf3-9289-47c3-a556-0418b670cb21-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.711022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvs6\" (UniqueName: 
\"kubernetes.io/projected/584dbcf3-9289-47c3-a556-0418b670cb21-kube-api-access-cmvs6\") pod \"oauth-openshift-58b6dc46cc-pd7g8\" (UID: \"584dbcf3-9289-47c3-a556-0418b670cb21\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.809705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825082 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwt8t" event={"ID":"fee8fc8f-8d72-4606-b115-4197f599cfcb","Type":"ContainerDied","Data":"6a4a6d1ca04b5a791e8cc232ac1bcb86844593ceb569a67c479055eb35e8caec"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825131 4792 scope.go:117] "RemoveContainer" containerID="f48560c9b3d2ef7a82a59fb15bc8fde5c80832322a811c783edfd4310dd071e6" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.825149 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwt8t" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.830160 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7ngd" event={"ID":"8333a325-229b-4dfd-a1f8-966f39bf55fc","Type":"ContainerDied","Data":"90fafa3ed5c527a4e521d0e12598d211c5be0e011b432479d9a243fe086b9d81"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.830219 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7ngd" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.832789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" event={"ID":"020a8218-62f4-4abf-a8d2-fed602de5f7f","Type":"ContainerDied","Data":"87d106e344fa51bb8e5e92cf97b7c6070e8daa571dd37784f078b0bfdb5ba165"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.832882 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-prqqp" Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.837134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerStarted","Data":"2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525"} Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.909153 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.952529 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwt8t"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.964826 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.973189 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l7ngd"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.985670 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.988777 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-prqqp"] Mar 01 09:13:10 crc kubenswrapper[4792]: I0301 09:13:10.992520 4792 scope.go:117] "RemoveContainer" containerID="8da1a6a75bb09923b49fb00136c53f3ad6da4b84f38958d2ae68061cac2e183c" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.024289 4792 scope.go:117] "RemoveContainer" containerID="e49f509c95cf50378f216231453383d3c1c559a790286ac1d1a0c0a75d1546f4" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.045760 4792 scope.go:117] "RemoveContainer" containerID="eb013fd09760d3456bbf7d33bd14056334305ee37d0672c8be7735153ad43a65" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.059549 4792 scope.go:117] "RemoveContainer" containerID="fe153c70dda9f9d22a7d85fdfcdcada881b1df1b9449eeb2a850a3a6208653c3" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.083021 4792 scope.go:117] "RemoveContainer" containerID="ea0440bd6858d820e5ab5d60d7085504804067b9ef42039aabaf07d6f0cd7730" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.092030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8"] Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.098305 4792 scope.go:117] "RemoveContainer" containerID="63337be8a65e2fc4ccac6c7a4ef78cbcdcc5f66cc06d8b699c08445a9271a940" Mar 01 09:13:11 crc kubenswrapper[4792]: W0301 09:13:11.101181 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584dbcf3_9289_47c3_a556_0418b670cb21.slice/crio-ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676 WatchSource:0}: Error finding container ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676: Status 404 returned error can't find the container with id ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676 Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.416082 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="020a8218-62f4-4abf-a8d2-fed602de5f7f" path="/var/lib/kubelet/pods/020a8218-62f4-4abf-a8d2-fed602de5f7f/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.416874 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8333a325-229b-4dfd-a1f8-966f39bf55fc" path="/var/lib/kubelet/pods/8333a325-229b-4dfd-a1f8-966f39bf55fc/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.417425 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee8fc8f-8d72-4606-b115-4197f599cfcb" path="/var/lib/kubelet/pods/fee8fc8f-8d72-4606-b115-4197f599cfcb/volumes" Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.843700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerStarted","Data":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.845386 4792 generic.go:334] "Generic (PLEG): container finished" podID="e93e87c0-86c5-446d-9f43-71d17960a351" containerID="66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" exitCode=0 Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.845439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.846506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerStarted","Data":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.848078 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.849646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerStarted","Data":"9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.853103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.855416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerStarted","Data":"5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63"} Mar 01 09:13:11 crc kubenswrapper[4792]: I0301 09:13:11.856347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" event={"ID":"584dbcf3-9289-47c3-a556-0418b670cb21","Type":"ContainerStarted","Data":"ddee0dfea9750e94ae58bfc7dc992b0bfdb5fd94b13629a83b3b0f431dc26676"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.863267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" event={"ID":"584dbcf3-9289-47c3-a556-0418b670cb21","Type":"ContainerStarted","Data":"af637ff35f9f2e8b44705b6017f39b85dcd173a564912f90d5eeac99c1dbbae3"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.865681 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1" exitCode=0 Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.865826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.866330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.872031 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.926644 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podStartSLOduration=14.926627477 podStartE2EDuration="14.926627477s" podCreationTimestamp="2026-03-01 09:12:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:12.922538875 +0000 UTC m=+322.164418082" watchObservedRunningTime="2026-03-01 09:13:12.926627477 +0000 UTC m=+322.168506674" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.952593 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6b9w" podStartSLOduration=4.616126009 podStartE2EDuration="1m29.952579725s" podCreationTimestamp="2026-03-01 09:11:43 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.804786092 +0000 UTC m=+234.046665289" lastFinishedPulling="2026-03-01 09:13:10.141239798 +0000 UTC m=+319.383119005" observedRunningTime="2026-03-01 09:13:12.948950805 +0000 
UTC m=+322.190830002" watchObservedRunningTime="2026-03-01 09:13:12.952579725 +0000 UTC m=+322.194458922" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.973959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cw675" podStartSLOduration=5.503586673 podStartE2EDuration="1m30.973937679s" podCreationTimestamp="2026-03-01 09:11:42 +0000 UTC" firstStartedPulling="2026-03-01 09:11:44.787979238 +0000 UTC m=+234.029858435" lastFinishedPulling="2026-03-01 09:13:10.258330244 +0000 UTC m=+319.500209441" observedRunningTime="2026-03-01 09:13:12.971323274 +0000 UTC m=+322.213202471" watchObservedRunningTime="2026-03-01 09:13:12.973937679 +0000 UTC m=+322.215816866" Mar 01 09:13:12 crc kubenswrapper[4792]: I0301 09:13:12.989495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" podStartSLOduration=15.989480677 podStartE2EDuration="15.989480677s" podCreationTimestamp="2026-03-01 09:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:12.988994325 +0000 UTC m=+322.230873522" watchObservedRunningTime="2026-03-01 09:13:12.989480677 +0000 UTC m=+322.231359874" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.229045 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.229225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.269494 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316442 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.316547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") pod \"e93e87c0-86c5-446d-9f43-71d17960a351\" (UID: \"e93e87c0-86c5-446d-9f43-71d17960a351\") " Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.320732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities" (OuterVolumeSpecName: "utilities") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.329079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw" (OuterVolumeSpecName: "kube-api-access-4tbnw") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "kube-api-access-4tbnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.361851 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.362084 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.417668 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tbnw\" (UniqueName: \"kubernetes.io/projected/e93e87c0-86c5-446d-9f43-71d17960a351-kube-api-access-4tbnw\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.417711 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.873519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ftk4v" event={"ID":"e93e87c0-86c5-446d-9f43-71d17960a351","Type":"ContainerDied","Data":"4d930733fc56c620ebbeeb1e0668d704b5df033f141d5439fd47f1e4757bacb0"} Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.873785 4792 scope.go:117] "RemoveContainer" containerID="66fe92889b2819aa449c7b25e42dabb8e422bdcf054e9c103637bfad90b589e3" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.874552 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ftk4v" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.888201 4792 scope.go:117] "RemoveContainer" containerID="4ae585f0378f09e8557ad2a19a0474c0c5f7679c6fe6f603a65e86c389831b72" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.897636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" podStartSLOduration=30.897613349 podStartE2EDuration="30.897613349s" podCreationTimestamp="2026-03-01 09:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:13.892386209 +0000 UTC m=+323.134265406" watchObservedRunningTime="2026-03-01 09:13:13.897613349 +0000 UTC m=+323.139492546" Mar 01 09:13:13 crc kubenswrapper[4792]: I0301 09:13:13.904453 4792 scope.go:117] "RemoveContainer" containerID="c36d8345eeaa99ca3b40ff37727a6e09aab9d36c26bc0292d9c0f3405bd1fe17" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.038407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93e87c0-86c5-446d-9f43-71d17960a351" (UID: "e93e87c0-86c5-446d-9f43-71d17960a351"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.126015 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93e87c0-86c5-446d-9f43-71d17960a351-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.206021 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.208866 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ftk4v"] Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.283585 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:14 crc kubenswrapper[4792]: > Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.398636 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:14 crc kubenswrapper[4792]: > Mar 01 09:13:14 crc kubenswrapper[4792]: I0301 09:13:14.886733 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerStarted","Data":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"} Mar 01 09:13:15 crc kubenswrapper[4792]: I0301 09:13:15.420317 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e93e87c0-86c5-446d-9f43-71d17960a351" path="/var/lib/kubelet/pods/e93e87c0-86c5-446d-9f43-71d17960a351/volumes" Mar 01 09:13:16 crc kubenswrapper[4792]: I0301 09:13:16.188745 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:16 crc kubenswrapper[4792]: I0301 09:13:16.188790 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.231323 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" probeResult="failure" output=< Mar 01 09:13:17 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:13:17 crc kubenswrapper[4792]: > Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.865476 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7nwc2" podStartSLOduration=6.796994658 podStartE2EDuration="1m32.86543761s" podCreationTimestamp="2026-03-01 09:11:45 +0000 UTC" firstStartedPulling="2026-03-01 09:11:48.04378161 +0000 UTC m=+237.285660807" lastFinishedPulling="2026-03-01 09:13:14.112224562 +0000 UTC m=+323.354103759" observedRunningTime="2026-03-01 09:13:14.915538944 +0000 UTC m=+324.157418141" watchObservedRunningTime="2026-03-01 09:13:17.86543761 +0000 UTC m=+327.107316807" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.870194 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.870427 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" 
containerName="controller-manager" containerID="cri-o://fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" gracePeriod=30 Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.933898 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.934095 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" containerID="cri-o://f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" gracePeriod=30 Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.934836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.940648 4792 patch_prober.go:28] interesting pod/route-controller-manager-79b74d7999-nwdjk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:38504->10.217.0.63:8443: read: connection reset by peer" start-of-body= Mar 01 09:13:17 crc kubenswrapper[4792]: I0301 09:13:17.940696 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:38504->10.217.0.63:8443: read: connection reset by peer" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.515817 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.521373 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601713 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601784 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601975 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.601995 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") pod \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\" (UID: \"f32059d6-24aa-4b5a-afb2-b2cf75704d53\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602063 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") pod \"cb03c466-db7d-4cc2-9744-fb90116eac6f\" (UID: \"cb03c466-db7d-4cc2-9744-fb90116eac6f\") " Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.602861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.603545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca" (OuterVolumeSpecName: "client-ca") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.603562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config" (OuterVolumeSpecName: "config") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.604185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config" (OuterVolumeSpecName: "config") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.607609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8" (OuterVolumeSpecName: "kube-api-access-chxj8") pod "f32059d6-24aa-4b5a-afb2-b2cf75704d53" (UID: "f32059d6-24aa-4b5a-afb2-b2cf75704d53"). InnerVolumeSpecName "kube-api-access-chxj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.608001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf" (OuterVolumeSpecName: "kube-api-access-wk7hf") pod "cb03c466-db7d-4cc2-9744-fb90116eac6f" (UID: "cb03c466-db7d-4cc2-9744-fb90116eac6f"). InnerVolumeSpecName "kube-api-access-wk7hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.703955 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704015 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chxj8\" (UniqueName: \"kubernetes.io/projected/f32059d6-24aa-4b5a-afb2-b2cf75704d53-kube-api-access-chxj8\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704040 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704058 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk7hf\" (UniqueName: \"kubernetes.io/projected/cb03c466-db7d-4cc2-9744-fb90116eac6f-kube-api-access-wk7hf\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704078 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704095 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb03c466-db7d-4cc2-9744-fb90116eac6f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704112 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f32059d6-24aa-4b5a-afb2-b2cf75704d53-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704129 4792 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f32059d6-24aa-4b5a-afb2-b2cf75704d53-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.704145 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb03c466-db7d-4cc2-9744-fb90116eac6f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912614 4792 generic.go:334] "Generic (PLEG): container finished" podID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" exitCode=0 Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerDied","Data":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.912771 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.913830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774f47c464-sjvj2" event={"ID":"cb03c466-db7d-4cc2-9744-fb90116eac6f","Type":"ContainerDied","Data":"2782208b0d666f0e62b1d1210bb9b0f8f9d295fb7d4cff04f1224273c092f525"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.913875 4792 scope.go:117] "RemoveContainer" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915514 4792 generic.go:334] "Generic (PLEG): container finished" podID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" exitCode=0 Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerDied","Data":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" event={"ID":"f32059d6-24aa-4b5a-afb2-b2cf75704d53","Type":"ContainerDied","Data":"8e9b5bce7a19f5245dcb156079f150381ad068104bb67fabd9a693966fe2fe05"} Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.915584 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.948415 4792 scope.go:117] "RemoveContainer" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: E0301 09:13:18.949532 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": container with ID starting with fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f not found: ID does not exist" containerID="fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.949582 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f"} err="failed to get container status \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": rpc error: code = NotFound desc = could not find container \"fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f\": container with ID starting with fe3861f72d22b8b2894aaf7d7bf6d91d16cd803c0347d640e158b07a91ef553f not found: ID does not exist" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.949611 4792 scope.go:117] "RemoveContainer" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.971033 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.973227 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774f47c464-sjvj2"] Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.974551 4792 scope.go:117] 
"RemoveContainer" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: E0301 09:13:18.975125 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": container with ID starting with f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050 not found: ID does not exist" containerID="f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.975203 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050"} err="failed to get container status \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": rpc error: code = NotFound desc = could not find container \"f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050\": container with ID starting with f1a8ea691a13ea588efa820347a57b92c8419278817b613643f842ce395bc050 not found: ID does not exist" Mar 01 09:13:18 crc kubenswrapper[4792]: I0301 09:13:18.993074 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.002150 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b74d7999-nwdjk"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161226 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161468 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 
09:13:19.161487 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161504 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-content" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161510 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-content" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161518 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161531 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-utilities" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161538 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="extract-utilities" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.161546 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161552 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161652 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" containerName="controller-manager" Mar 01 09:13:19 crc 
kubenswrapper[4792]: I0301 09:13:19.161663 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" containerName="route-controller-manager" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.161675 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e87c0-86c5-446d-9f43-71d17960a351" containerName="registry-server" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162092 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162416 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162611 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162951 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.162996 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163028 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163061 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" gracePeriod=15 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163489 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163675 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 
01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163689 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163704 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163710 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163718 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163724 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163731 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163737 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163752 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163761 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163767 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163774 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163780 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163788 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163794 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163801 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163806 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.163817 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163836 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163928 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163938 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163945 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163952 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163960 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163968 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.163975 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.164066 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164072 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164154 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164162 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.164169 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.228562 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]log ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]api-openshift-apiserver-available ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]api-openshift-oauth-apiserver-available ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]informer-sync ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/generic-apiserver-start-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-filter ok Mar 01 09:13:19 crc 
kubenswrapper[4792]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiextensions-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-apiextensions-controllers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/crd-informer-synced ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-system-namespaces-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/rbac/bootstrap-roles ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/bootstrap-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/start-kube-aggregator-informers ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-registration-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 01 09:13:19 crc kubenswrapper[4792]: 
[+]poststarthook/apiservice-discovery-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]autoregister-completion ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-openapi-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 01 09:13:19 crc kubenswrapper[4792]: [-]shutdown failed: reason withheld Mar 01 09:13:19 crc kubenswrapper[4792]: readyz check failed Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.228627 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.238105 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311358 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311539 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.311586 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412356 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412370 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412439 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.412614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.416017 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb03c466-db7d-4cc2-9744-fb90116eac6f" path="/var/lib/kubelet/pods/cb03c466-db7d-4cc2-9744-fb90116eac6f/volumes" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.416525 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32059d6-24aa-4b5a-afb2-b2cf75704d53" path="/var/lib/kubelet/pods/f32059d6-24aa-4b5a-afb2-b2cf75704d53/volumes" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.535264 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:13:19 crc kubenswrapper[4792]: W0301 09:13:19.565403 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c WatchSource:0}: Error finding container be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c: Status 404 returned error can't find the container with id be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c Mar 01 09:13:19 crc kubenswrapper[4792]: E0301 09:13:19.569677 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1898acc14bb41be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,LastTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.923266 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.924578 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925334 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925358 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925369 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925379 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" exitCode=2 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.925462 4792 scope.go:117] "RemoveContainer" containerID="40016e70dd35e9a72a7746b106959e0ca2677d8d541a6f492db1731ead2b8ca2" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.928326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.928376 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"be28ae32e2e32e88a62b899a8c7d88cd88d16451f547cb14206f079e225e827c"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929009 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929589 4792 generic.go:334] "Generic (PLEG): container finished" podID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerID="7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f" exitCode=0 Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.929627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerDied","Data":"7ee832f8488525fdd9e2872f5f8c217b74292b7dc7a5f5c0537fec99a3845c5f"} Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.930313 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:19 crc kubenswrapper[4792]: I0301 09:13:19.930930 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: 
connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.810713 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.816581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.817266 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.817975 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.818439 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:20 crc kubenswrapper[4792]: I0301 09:13:20.942156 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.289853 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290565 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290744 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.290969 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.411555 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.412199 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.412555 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.446980 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") pod \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\" (UID: \"4bc9fd32-61ce-4fbb-b67e-1376102f5384\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.447198 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock" (OuterVolumeSpecName: "var-lock") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.447315 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.451723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4bc9fd32-61ce-4fbb-b67e-1376102f5384" (UID: "4bc9fd32-61ce-4fbb-b67e-1376102f5384"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548631 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548677 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.548685 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc9fd32-61ce-4fbb-b67e-1376102f5384-var-lock\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.555404 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc 
kubenswrapper[4792]: I0301 09:13:21.556233 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.557210 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.557811 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.558611 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.559204 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651054 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651938 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652147 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.651144 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.652535 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653248 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653428 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.653616 4792 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.953432 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.954553 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" exitCode=0 Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.954745 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.955061 4792 scope.go:117] "RemoveContainer" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc9fd32-61ce-4fbb-b67e-1376102f5384","Type":"ContainerDied","Data":"e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10"} Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958553 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1aa5214c0457299056866d70c565c5094c4dfa52c85c6428fa756810ed8da10" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.958653 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.986085 4792 scope.go:117] "RemoveContainer" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.991900 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992340 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992618 4792 
status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.992868 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.993635 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.994410 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 09:13:21.994638 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:21 crc kubenswrapper[4792]: I0301 
09:13:21.995526 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.007438 4792 scope.go:117] "RemoveContainer" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.024084 4792 scope.go:117] "RemoveContainer" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.043673 4792 scope.go:117] "RemoveContainer" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.064945 4792 scope.go:117] "RemoveContainer" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.086263 4792 scope.go:117] "RemoveContainer" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.086847 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": container with ID starting with 4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40 not found: ID does not exist" containerID="4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.086977 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40"} err="failed to get container status 
\"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": rpc error: code = NotFound desc = could not find container \"4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40\": container with ID starting with 4c38215135d7b0d135f95361664363266a76db5a039fae95c1e2507e52ee9f40 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087067 4792 scope.go:117] "RemoveContainer" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.087720 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": container with ID starting with d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48 not found: ID does not exist" containerID="d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087776 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48"} err="failed to get container status \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": rpc error: code = NotFound desc = could not find container \"d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48\": container with ID starting with d923d3c9397ce07959d1db5372a20d94426c2a1e7cc1d35d5cc7ccc97c091b48 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.087810 4792 scope.go:117] "RemoveContainer" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.088202 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": container with ID starting with 5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1 not found: ID does not exist" containerID="5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.088290 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1"} err="failed to get container status \"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": rpc error: code = NotFound desc = could not find container \"5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1\": container with ID starting with 5e173c1be424415261ae991ad7ddecde3d464924594e745e10f5ba51ff5cddb1 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.088357 4792 scope.go:117] "RemoveContainer" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.089045 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": container with ID starting with 8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7 not found: ID does not exist" containerID="8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089124 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7"} err="failed to get container status \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": rpc error: code = NotFound desc = could not find container \"8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7\": container with ID 
starting with 8abb4b29aedf96d518ab4cf6e76c3c906de79b5ce8f6f2736ba44cafd96d17d7 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089201 4792 scope.go:117] "RemoveContainer" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.089627 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": container with ID starting with 1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610 not found: ID does not exist" containerID="1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089702 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610"} err="failed to get container status \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": rpc error: code = NotFound desc = could not find container \"1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610\": container with ID starting with 1e8b2eaa0b358b53be6f015170b75351ae96bdc3c599f1fe088aa5aebd38c610 not found: ID does not exist" Mar 01 09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.089773 4792 scope.go:117] "RemoveContainer" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 09:13:22 crc kubenswrapper[4792]: E0301 09:13:22.090169 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": container with ID starting with d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada not found: ID does not exist" containerID="d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada" Mar 01 
09:13:22 crc kubenswrapper[4792]: I0301 09:13:22.090217 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada"} err="failed to get container status \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": rpc error: code = NotFound desc = could not find container \"d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada\": container with ID starting with d8e98c13ffd383fb7c9bb3ab582deb45acb9814694228163494e2dd28067eada not found: ID does not exist" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.299111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.300997 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.301395 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.301851 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 
01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.302611 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.303462 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.360324 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cw675" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.360877 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.361383 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.361760 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.362140 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.362551 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.422349 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.438757 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.439572 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.440138 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.440782 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.441736 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.442262 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.510594 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.511406 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: 
connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.512012 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.512544 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.513008 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:23 crc kubenswrapper[4792]: I0301 09:13:23.513407 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.079127 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 
09:13:24.079882 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.080378 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.080878 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.081340 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:24 crc kubenswrapper[4792]: I0301 09:13:24.081391 4792 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.081730 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="200ms" Mar 01 09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.283103 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="400ms" Mar 01 
09:13:24 crc kubenswrapper[4792]: E0301 09:13:24.684033 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="800ms" Mar 01 09:13:25 crc kubenswrapper[4792]: E0301 09:13:25.485009 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="1.6s" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.262577 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.263577 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.264973 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.265759 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 
38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.266464 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.266966 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.267607 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.322293 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7nwc2" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.323221 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.323805 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.324514 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.324998 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.325421 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:26 crc kubenswrapper[4792]: I0301 09:13:26.325858 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:27 crc kubenswrapper[4792]: E0301 09:13:27.086390 4792 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="3.2s" Mar 01 09:13:29 crc kubenswrapper[4792]: E0301 09:13:29.459938 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.89:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1898acc14bb41be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,LastTimestamp:2026-03-01 09:13:19.569050594 +0000 UTC m=+328.810929791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 01 09:13:30 crc kubenswrapper[4792]: E0301 09:13:30.287190 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.89:6443: connect: connection refused" interval="6.4s" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.408841 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412170 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412466 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412655 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412791 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.412970 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.413295 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.413874 4792 status_manager.go:851] "Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414147 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414499 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.414880 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.415292 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.415533 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.425075 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.425108 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:31 crc kubenswrapper[4792]: E0301 09:13:31.425536 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: I0301 09:13:31.426276 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:31 crc kubenswrapper[4792]: W0301 09:13:31.462700 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7 WatchSource:0}: Error finding container de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7: Status 404 returned error can't find the container with id de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7 Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030211 4792 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2c69491c8520b0c3299bb8bff8a9128dc94446caa1325badc2162e5d39d0744d" exitCode=0 Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2c69491c8520b0c3299bb8bff8a9128dc94446caa1325badc2162e5d39d0744d"} Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.030697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"de26490347370479ac4fe71a4febf0c58330e3ef1ac9defc73a62c8cb4d077e7"} Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031275 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031316 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.031783 4792 status_manager.go:851] 
"Failed to get status for pod" podUID="22c7c368-3523-4224-aebd-59b29640bed0" pod="openshift-marketplace/redhat-operators-7nwc2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7nwc2\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: E0301 09:13:32.032195 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.032420 4792 status_manager.go:851] "Failed to get status for pod" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.032894 4792 status_manager.go:851] "Failed to get status for pod" podUID="584dbcf3-9289-47c3-a556-0418b670cb21" pod="openshift-authentication/oauth-openshift-58b6dc46cc-pd7g8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-58b6dc46cc-pd7g8\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.033288 4792 status_manager.go:851] "Failed to get status for pod" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" pod="openshift-marketplace/community-operators-p6b9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-p6b9w\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.034064 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:32 crc kubenswrapper[4792]: I0301 09:13:32.034542 4792 status_manager.go:851] "Failed to get status for pod" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" pod="openshift-marketplace/certified-operators-cw675" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cw675\": dial tcp 38.102.83.89:6443: connect: connection refused" Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.038847 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6a36614beeb413ffb051ac3aaa16c27304eae223b3d69894745febf80f64bc1"} Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.039267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6aaac00c2bd57a2c892468be6a66b4edf964fad95d25779e62f9c9fb81968827"} Mar 01 09:13:33 crc kubenswrapper[4792]: I0301 09:13:33.039277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ed0425c88611496504d23e2b0e98365fe19146df38e3248933a2747a1cfd6ffb"} Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"185916b4b4557494e9003d7fe65c3f35fd6e3b5ef2c507151a5435e3f373b9a0"} Mar 01 09:13:34 crc kubenswrapper[4792]: 
I0301 09:13:34.048163 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aacad5073f12ad0c4d39fbff51590bbf75cc36a18c5050288d92bbb137e5415d"} Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048585 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:34 crc kubenswrapper[4792]: I0301 09:13:34.048619 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.057183 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059267 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059341 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c" exitCode=1 Mar 01 09:13:35 crc kubenswrapper[4792]: I0301 09:13:35.059391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c"} Mar 01 09:13:35 crc 
kubenswrapper[4792]: I0301 09:13:35.060290 4792 scope.go:117] "RemoveContainer" containerID="859afd52205b95f45333b5975e67308ee360d3887aa679a922b7cba5e3acda4c" Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.066377 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.067987 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.068042 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96c3960038c0101f16ce250d82c61191002c8a6e70c05569b64ed4476b4203ef"} Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.426424 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.426748 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:36 crc kubenswrapper[4792]: I0301 09:13:36.434375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.064810 4792 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.082039 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 
09:13:39.082238 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:39 crc kubenswrapper[4792]: I0301 09:13:39.085453 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.086929 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.087255 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.115593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:13:40 crc kubenswrapper[4792]: I0301 09:13:40.119225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:13:41 crc kubenswrapper[4792]: I0301 09:13:41.006954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:13:41 crc kubenswrapper[4792]: I0301 09:13:41.429112 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1fe19d9-d509-4345-a79f-7d4bad570cf9" Mar 01 09:13:49 crc kubenswrapper[4792]: I0301 09:13:49.664557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.176700 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.392394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.702944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.827379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.828797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 01 09:13:50 crc kubenswrapper[4792]: I0301 09:13:50.873756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.009979 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.102838 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.343524 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.358578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.441543 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 
09:13:51.560390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.640325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.661838 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.667210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.684632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.715938 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.847459 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 01 09:13:51 crc kubenswrapper[4792]: I0301 09:13:51.867779 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.003113 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.037023 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.113124 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" 
Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.163625 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.331090 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.587665 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.588269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=33.588250879 podStartE2EDuration="33.588250879s" podCreationTimestamp="2026-03-01 09:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:39.142058798 +0000 UTC m=+348.383937995" watchObservedRunningTime="2026-03-01 09:13:52.588250879 +0000 UTC m=+361.830130116" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595310 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595374 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k","openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-55c479c949-bxrgx"] Mar 01 09:13:52 crc kubenswrapper[4792]: E0301 09:13:52.595643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.595673 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer" Mar 01 09:13:52 crc 
kubenswrapper[4792]: I0301 09:13:52.595847 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc9fd32-61ce-4fbb-b67e-1376102f5384" containerName="installer" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.596649 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.596856 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de0393ad-517c-4b31-9bf4-ed1a3d855bc1" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.597242 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.597288 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.601807 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.602099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.603801 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604110 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604316 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 01 09:13:52 crc 
kubenswrapper[4792]: I0301 09:13:52.604488 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604738 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.604883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605012 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605082 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605285 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605399 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vjz\" (UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605900 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzkw\" (UniqueName: 
\"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.605983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 
09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606327 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.606939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.613571 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.626587 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.626555057000001 podStartE2EDuration="13.626555057s" podCreationTimestamp="2026-03-01 
09:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:13:52.624984139 +0000 UTC m=+361.866863376" watchObservedRunningTime="2026-03-01 09:13:52.626555057 +0000 UTC m=+361.868434304" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.630326 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.692293 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707482 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vjz\" (UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzkw\" (UniqueName: \"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod \"controller-manager-55c479c949-bxrgx\" (UID: 
\"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.707586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.709200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-config\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.709408 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-config\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ec3613-4fe1-4e71-8991-f5be9a94579e-client-ca\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710533 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-client-ca\") pod 
\"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.710977 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/059440a1-ff60-496f-bdbc-8218b5ceb3f7-proxy-ca-bundles\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.714613 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ec3613-4fe1-4e71-8991-f5be9a94579e-serving-cert\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.715799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/059440a1-ff60-496f-bdbc-8218b5ceb3f7-serving-cert\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.772660 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzkw\" (UniqueName: \"kubernetes.io/projected/059440a1-ff60-496f-bdbc-8218b5ceb3f7-kube-api-access-zqzkw\") pod \"controller-manager-55c479c949-bxrgx\" (UID: \"059440a1-ff60-496f-bdbc-8218b5ceb3f7\") " pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.772877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vjz\" 
(UniqueName: \"kubernetes.io/projected/f9ec3613-4fe1-4e71-8991-f5be9a94579e-kube-api-access-g8vjz\") pod \"route-controller-manager-f66f7bb8d-7bp5k\" (UID: \"f9ec3613-4fe1-4e71-8991-f5be9a94579e\") " pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.801640 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.890091 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.940327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:13:52 crc kubenswrapper[4792]: I0301 09:13:52.957636 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.094426 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.135658 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.225525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.353882 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.359225 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.417811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.516433 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.537113 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.551475 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.576648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 
09:13:53.576845 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.656126 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.666331 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.719188 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.881471 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.882892 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.953220 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 01 09:13:53 crc kubenswrapper[4792]: I0301 09:13:53.995518 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.103188 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.119176 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.143226 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.230996 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.252985 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.476069 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.545610 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.573395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.637359 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.836870 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.847590 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.873965 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:13:54 crc kubenswrapper[4792]: I0301 09:13:54.913197 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.062091 
4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.145213 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.158516 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.345586 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.554536 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.555592 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.707421 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 01 09:13:55 crc kubenswrapper[4792]: I0301 09:13:55.961759 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.021462 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.051238 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.089418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.105509 4792 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.114535 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.135940 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.159482 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.172461 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.236195 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.277246 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.415958 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.417334 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.477578 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.521440 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.544465 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.577663 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.633955 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.636883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.653309 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.664271 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.764811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.887400 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.928210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.955785 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 01 09:13:56 crc kubenswrapper[4792]: I0301 09:13:56.985946 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 01 09:13:57 crc 
kubenswrapper[4792]: I0301 09:13:57.070843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.079125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.113376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.187359 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.284747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.309051 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.388549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.448439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.459408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.554362 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.596395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 01 09:13:57 crc 
kubenswrapper[4792]: I0301 09:13:57.602006 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.613238 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.814083 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.815882 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.968071 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.980265 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 01 09:13:57 crc kubenswrapper[4792]: I0301 09:13:57.982876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.011489 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.029764 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.204369 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.257075 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 01 09:13:58 
crc kubenswrapper[4792]: I0301 09:13:58.307348 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.348306 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.416737 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.431484 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.517748 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.519344 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.543267 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.566507 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.600693 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.602473 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.630043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.773001 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.817265 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.827113 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.833180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 01 09:13:58 crc kubenswrapper[4792]: I0301 09:13:58.880920 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.060992 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.084269 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.124150 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.142840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.148628 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.158571 4792 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.158948 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.226411 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.356198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.377674 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.390191 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.427660 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.473673 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.481790 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.518684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.524461 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"trusted-ca-bundle" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.531250 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.582712 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.786673 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.854181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.911125 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 01 09:13:59 crc kubenswrapper[4792]: I0301 09:13:59.948335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.055813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.123154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.150175 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.157682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 
09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.158270 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.160041 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.160676 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.161481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.253774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.303381 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.308034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.319999 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.378377 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.409007 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.423815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.424083 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.428705 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"auto-csr-approver-29539274-zckv2\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.430147 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.475528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.477939 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.495718 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.510744 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.585482 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.690298 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.741857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.772038 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.801111 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.870376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.880680 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 01 09:14:00 crc 
kubenswrapper[4792]: I0301 09:14:00.969325 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.983008 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 01 09:14:00 crc kubenswrapper[4792]: I0301 09:14:00.996557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.026395 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.033290 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.049136 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.054079 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.278645 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.284832 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.303756 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.311804 4792 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.321569 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.368834 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.378566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.393784 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.422358 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.422750 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" gracePeriod=5 Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.509550 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.512018 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.663317 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 
09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.727022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.871407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.888110 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.931871 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.933734 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.952634 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 01 09:14:01 crc kubenswrapper[4792]: I0301 09:14:01.959981 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.121350 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.159821 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.276806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c479c949-bxrgx"] Mar 01 09:14:02 crc 
kubenswrapper[4792]: I0301 09:14:02.280506 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.298290 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"] Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.530395 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.550989 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.583990 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.715627 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.739648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.752404 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.824828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.824829 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.872435 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.902259 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 01 09:14:02 crc kubenswrapper[4792]: I0301 09:14:02.902851 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005209 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod 
"route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005295 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod "route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005334 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f" Netns:"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: 
[openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod "route-controller-manager-f66f7bb8d-7bp5k" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.005413 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager(f9ec3613-4fe1-4e71-8991-f5be9a94579e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager(f9ec3613-4fe1-4e71-8991-f5be9a94579e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-f66f7bb8d-7bp5k_openshift-route-controller-manager_f9ec3613-4fe1-4e71-8991-f5be9a94579e_0(1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f): error adding pod openshift-route-controller-manager_route-controller-manager-f66f7bb8d-7bp5k to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f\\\" 
Netns:\\\"/var/run/netns/c5b3c562-e98d-43f3-9947-f2b0b86c571d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-f66f7bb8d-7bp5k;K8S_POD_INFRA_CONTAINER_ID=1e2720b79b9d762674d7a7d3843e4f18329f00c615ae105fe7a340ffdc9a654f;K8S_POD_UID=f9ec3613-4fe1-4e71-8991-f5be9a94579e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k] networking: Multus: [openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k/f9ec3613-4fe1-4e71-8991-f5be9a94579e]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-f66f7bb8d-7bp5k in out of cluster comm: pod \\\"route-controller-manager-f66f7bb8d-7bp5k\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" podUID="f9ec3613-4fe1-4e71-8991-f5be9a94579e" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016167 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016252 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016274 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod 
openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef" Netns:"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod "controller-manager-55c479c949-bxrgx" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.016332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-55c479c949-bxrgx_openshift-controller-manager(059440a1-ff60-496f-bdbc-8218b5ceb3f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"controller-manager-55c479c949-bxrgx_openshift-controller-manager(059440a1-ff60-496f-bdbc-8218b5ceb3f7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-55c479c949-bxrgx_openshift-controller-manager_059440a1-ff60-496f-bdbc-8218b5ceb3f7_0(d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef): error adding pod openshift-controller-manager_controller-manager-55c479c949-bxrgx to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef\\\" Netns:\\\"/var/run/netns/b77ff266-85d5-40ff-bdf8-eae839742e42\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-55c479c949-bxrgx;K8S_POD_INFRA_CONTAINER_ID=d1012c3ba2d9ba9c869f80eccdd22940410646d41d24b82510d46f497988cbef;K8S_POD_UID=059440a1-ff60-496f-bdbc-8218b5ceb3f7\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-55c479c949-bxrgx] networking: Multus: [openshift-controller-manager/controller-manager-55c479c949-bxrgx/059440a1-ff60-496f-bdbc-8218b5ceb3f7]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-55c479c949-bxrgx in out of cluster comm: pod \\\"controller-manager-55c479c949-bxrgx\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" podUID="059440a1-ff60-496f-bdbc-8218b5ceb3f7" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.049148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.094729 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.100540 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.194703 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217200 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.217852 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.218359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.308161 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.325040 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372740 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod 
auto-csr-approver-29539274-zckv2 in out of cluster comm: pod "auto-csr-approver-29539274-zckv2" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372821 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod 
"auto-csr-approver-29539274-zckv2" not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372847 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 01 09:14:03 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78" Netns:"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: [openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod "auto-csr-approver-29539274-zckv2" 
not found Mar 01 09:14:03 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 01 09:14:03 crc kubenswrapper[4792]: > pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:03 crc kubenswrapper[4792]: E0301 09:14:03.372932 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29539274-zckv2_openshift-infra(81a3bf03-822b-4b69-93a3-b420d8f58efd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29539274-zckv2_openshift-infra(81a3bf03-822b-4b69-93a3-b420d8f58efd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29539274-zckv2_openshift-infra_81a3bf03-822b-4b69-93a3-b420d8f58efd_0(973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78): error adding pod openshift-infra_auto-csr-approver-29539274-zckv2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78\\\" Netns:\\\"/var/run/netns/94f9648b-6ae8-4642-82c8-580fccbe1e89\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29539274-zckv2;K8S_POD_INFRA_CONTAINER_ID=973c84189d303a057ec7d6896ef4f066682b84dbe6f8f59321aa5c83025f3b78;K8S_POD_UID=81a3bf03-822b-4b69-93a3-b420d8f58efd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29539274-zckv2] networking: Multus: 
[openshift-infra/auto-csr-approver-29539274-zckv2/81a3bf03-822b-4b69-93a3-b420d8f58efd]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29539274-zckv2 in out of cluster comm: pod \\\"auto-csr-approver-29539274-zckv2\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29539274-zckv2" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.505953 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.571446 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.641866 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.689825 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.929689 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k"] Mar 01 09:14:03 crc kubenswrapper[4792]: I0301 09:14:03.996096 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.056411 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c479c949-bxrgx"] Mar 01 09:14:04 crc kubenswrapper[4792]: W0301 09:14:04.060022 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059440a1_ff60_496f_bdbc_8218b5ceb3f7.slice/crio-93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9 WatchSource:0}: Error finding container 93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9: Status 404 returned error can't find the container with id 93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9 Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.166522 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.233731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" event={"ID":"f9ec3613-4fe1-4e71-8991-f5be9a94579e","Type":"ContainerStarted","Data":"1e70c60c4e7c815e1ff19c61b0b52d3e0a670b9a447599618d07c39ce492f957"} Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.234862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" event={"ID":"059440a1-ff60-496f-bdbc-8218b5ceb3f7","Type":"ContainerStarted","Data":"93588b0b710160330739dce49ef130b3bc798f987a69e63656832318c0b660e9"} Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.235024 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.235677 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.425088 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.540885 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.608320 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.627568 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:14:04 crc kubenswrapper[4792]: W0301 09:14:04.634084 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a3bf03_822b_4b69_93a3_b420d8f58efd.slice/crio-a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7 WatchSource:0}: Error finding container a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7: Status 404 returned error can't find the container with id a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7 Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.640443 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.685695 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.756062 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.837962 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 01 09:14:04 crc kubenswrapper[4792]: I0301 09:14:04.860196 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.003642 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.059994 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.083048 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.099535 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.172165 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.240576 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" event={"ID":"f9ec3613-4fe1-4e71-8991-f5be9a94579e","Type":"ContainerStarted","Data":"b92fa0430770ca4ba11287fde661a77eb495b1dfe8b0c1b92d46f3d044ca5494"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.241669 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:05 
crc kubenswrapper[4792]: I0301 09:14:05.243276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerStarted","Data":"a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.245688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" event={"ID":"059440a1-ff60-496f-bdbc-8218b5ceb3f7","Type":"ContainerStarted","Data":"f0bcfc05997e9dda23611e6b0d7d0d240d9f4f4c0b360a81f6b2c1cb880b601d"} Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.246242 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.251336 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.253395 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.262791 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f66f7bb8d-7bp5k" podStartSLOduration=48.262777829 podStartE2EDuration="48.262777829s" podCreationTimestamp="2026-03-01 09:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:14:05.261674422 +0000 UTC m=+374.503553619" watchObservedRunningTime="2026-03-01 09:14:05.262777829 +0000 UTC m=+374.504657026" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.330965 4792 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 01 09:14:05 crc kubenswrapper[4792]: I0301 09:14:05.380150 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.251387 4792 generic.go:334] "Generic (PLEG): container finished" podID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerID="8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402" exitCode=0 Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.251587 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerDied","Data":"8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402"} Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.262872 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55c479c949-bxrgx" podStartSLOduration=49.262854359 podStartE2EDuration="49.262854359s" podCreationTimestamp="2026-03-01 09:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:14:05.354100652 +0000 UTC m=+374.595979849" watchObservedRunningTime="2026-03-01 09:14:06.262854359 +0000 UTC m=+375.504733556" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.992570 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 01 09:14:06 crc kubenswrapper[4792]: I0301 09:14:06.992640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096741 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096769 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096838 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096863 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096858 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.096884 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097055 4792 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097065 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097073 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.097081 4792 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.103130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.198014 4792 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260699 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260752 4792 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" exitCode=137 Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260838 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.260949 4792 scope.go:117] "RemoveContainer" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.279152 4792 scope.go:117] "RemoveContainer" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: E0301 09:14:07.279575 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": container with ID starting with 0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae not found: ID does not exist" containerID="0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.279638 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae"} err="failed to get container status \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": rpc error: code = NotFound desc = could not find container \"0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae\": container with ID starting with 0053a8c0b15b85cc2122bda95d2798eb0ba27a1e086aa18527efd0c4e0caccae not found: ID does not exist" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.415658 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.416242 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.429067 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.429110 4792 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ee49ed71-d637-4156-a077-becc9128fadb" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.433285 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.433334 4792 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ee49ed71-d637-4156-a077-becc9128fadb" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.488108 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.606483 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") pod \"81a3bf03-822b-4b69-93a3-b420d8f58efd\" (UID: \"81a3bf03-822b-4b69-93a3-b420d8f58efd\") " Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.610975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28" (OuterVolumeSpecName: "kube-api-access-gnb28") pod "81a3bf03-822b-4b69-93a3-b420d8f58efd" (UID: "81a3bf03-822b-4b69-93a3-b420d8f58efd"). InnerVolumeSpecName "kube-api-access-gnb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:14:07 crc kubenswrapper[4792]: I0301 09:14:07.707788 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnb28\" (UniqueName: \"kubernetes.io/projected/81a3bf03-822b-4b69-93a3-b420d8f58efd-kube-api-access-gnb28\") on node \"crc\" DevicePath \"\"" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539274-zckv2" event={"ID":"81a3bf03-822b-4b69-93a3-b420d8f58efd","Type":"ContainerDied","Data":"a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7"} Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270762 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c24e76c9cd27814b70d42d5440e38471f26a7d1011faa03af0f134d3a951a7" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.270779 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539274-zckv2" Mar 01 09:14:08 crc kubenswrapper[4792]: I0301 09:14:08.594026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.137764 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: E0301 09:15:00.139733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.139820 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: E0301 09:15:00.139929 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140019 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140202 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" containerName="oc" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140282 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.140697 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.147080 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.147647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.148234 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.156215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.257865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.257938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.258003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.259012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.264093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.274335 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"collect-profiles-29539275-5mhm4\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.456066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.643739 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 09:15:00 crc kubenswrapper[4792]: I0301 09:15:00.675281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerStarted","Data":"fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe"} Mar 01 09:15:01 crc kubenswrapper[4792]: I0301 09:15:01.682268 4792 generic.go:334] "Generic (PLEG): container finished" podID="8558286c-6cb2-4061-bb84-07803d33b576" containerID="25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf" exitCode=0 Mar 01 09:15:01 crc kubenswrapper[4792]: I0301 09:15:01.682429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" 
event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerDied","Data":"25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf"} Mar 01 09:15:02 crc kubenswrapper[4792]: I0301 09:15:02.962261 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094370 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094443 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.094519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") pod \"8558286c-6cb2-4061-bb84-07803d33b576\" (UID: \"8558286c-6cb2-4061-bb84-07803d33b576\") " Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.095579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume" (OuterVolumeSpecName: "config-volume") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.099891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl" (OuterVolumeSpecName: "kube-api-access-s26sl") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "kube-api-access-s26sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.109046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8558286c-6cb2-4061-bb84-07803d33b576" (UID: "8558286c-6cb2-4061-bb84-07803d33b576"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196253 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26sl\" (UniqueName: \"kubernetes.io/projected/8558286c-6cb2-4061-bb84-07803d33b576-kube-api-access-s26sl\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196290 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8558286c-6cb2-4061-bb84-07803d33b576-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.196300 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8558286c-6cb2-4061-bb84-07803d33b576-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692665 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" 
event={"ID":"8558286c-6cb2-4061-bb84-07803d33b576","Type":"ContainerDied","Data":"fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe"} Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692705 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc342858aad6d7fe721366ec61b4a6ce763019dbdc83ff19b9b3acaafecb55fe" Mar 01 09:15:03 crc kubenswrapper[4792]: I0301 09:15:03.692736 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4" Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.482041 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.482808 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6b9w" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" containerID="cri-o://9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" gracePeriod=2 Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.735142 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fd91972-6bfc-4041-abc2-8f4298584603" containerID="9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" exitCode=0 Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.735182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea"} Mar 01 09:15:10 crc kubenswrapper[4792]: I0301 09:15:10.918198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033558 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.033619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") pod \"6fd91972-6bfc-4041-abc2-8f4298584603\" (UID: \"6fd91972-6bfc-4041-abc2-8f4298584603\") " Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.034352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities" (OuterVolumeSpecName: "utilities") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.047115 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h" (OuterVolumeSpecName: "kube-api-access-whj6h") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "kube-api-access-whj6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.084735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fd91972-6bfc-4041-abc2-8f4298584603" (UID: "6fd91972-6bfc-4041-abc2-8f4298584603"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134730 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whj6h\" (UniqueName: \"kubernetes.io/projected/6fd91972-6bfc-4041-abc2-8f4298584603-kube-api-access-whj6h\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134764 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.134774 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fd91972-6bfc-4041-abc2-8f4298584603-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.744941 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6b9w" event={"ID":"6fd91972-6bfc-4041-abc2-8f4298584603","Type":"ContainerDied","Data":"477ff8111ce50f202ac20e7801f2ac3bd82c5ed546b9e6aeee69367ec0d09908"} Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.744998 4792 scope.go:117] "RemoveContainer" containerID="9b1ebd40c76d6a05f41f90e7e6ff9f66cc3e0ca75b2cbd8232430f1380f165ea" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.745216 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6b9w" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.761729 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.768810 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6b9w"] Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.769333 4792 scope.go:117] "RemoveContainer" containerID="303602554c4af893edadfe462be0a7b315bc51a29028b034dfe41a16ef93ff3e" Mar 01 09:15:11 crc kubenswrapper[4792]: I0301 09:15:11.787416 4792 scope.go:117] "RemoveContainer" containerID="aa50e0a49246a84ff0433571a1c37df20be8eda2033b2980a1562df09196f210" Mar 01 09:15:13 crc kubenswrapper[4792]: I0301 09:15:13.418755 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" path="/var/lib/kubelet/pods/6fd91972-6bfc-4041-abc2-8f4298584603/volumes" Mar 01 09:15:34 crc kubenswrapper[4792]: I0301 09:15:34.943325 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:15:34 crc kubenswrapper[4792]: I0301 09:15:34.944075 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.344661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:36 crc kubenswrapper[4792]: 
E0301 09:15:36.345133 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-content" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345145 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-content" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345159 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345166 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345178 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-utilities" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345185 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="extract-utilities" Mar 01 09:15:36 crc kubenswrapper[4792]: E0301 09:15:36.345192 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345198 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345282 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8558286c-6cb2-4061-bb84-07803d33b576" containerName="collect-profiles" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.345297 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd91972-6bfc-4041-abc2-8f4298584603" containerName="registry-server" Mar 01 09:15:36 crc 
kubenswrapper[4792]: I0301 09:15:36.345644 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.364208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449153 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttznf\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449282 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449403 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.449455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.474688 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.550634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.550771 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5345b06a-b353-4d3b-aee6-3e69c35a6325-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.551726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttznf\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.553500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-trusted-ca\") pod 
\"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.555319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-certificates\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.558277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-registry-tls\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.563630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5345b06a-b353-4d3b-aee6-3e69c35a6325-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.565733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-bound-sa-token\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.572836 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttznf\" (UniqueName: 
\"kubernetes.io/projected/5345b06a-b353-4d3b-aee6-3e69c35a6325-kube-api-access-ttznf\") pod \"image-registry-66df7c8f76-5rg6z\" (UID: \"5345b06a-b353-4d3b-aee6-3e69c35a6325\") " pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:36 crc kubenswrapper[4792]: I0301 09:15:36.663008 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.097094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5rg6z"] Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" event={"ID":"5345b06a-b353-4d3b-aee6-3e69c35a6325","Type":"ContainerStarted","Data":"b943412d0fc1e2de925dfa8818689ec70ad2378afa3ec77e5b5c25996cd150c9"} Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881754 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.881786 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" event={"ID":"5345b06a-b353-4d3b-aee6-3e69c35a6325","Type":"ContainerStarted","Data":"796761eff49436efc7fb008294beb4a0f81a2f40a24492a018407426479ccf5b"} Mar 01 09:15:37 crc kubenswrapper[4792]: I0301 09:15:37.903872 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" podStartSLOduration=1.903842274 podStartE2EDuration="1.903842274s" podCreationTimestamp="2026-03-01 09:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:15:37.89813619 +0000 UTC m=+467.140015387" 
watchObservedRunningTime="2026-03-01 09:15:37.903842274 +0000 UTC m=+467.145721491" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.738385 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cw675"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.739001 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cw675" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" containerID="cri-o://5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.745176 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxb87"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.745445 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wxb87" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" containerID="cri-o://9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.754319 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.754653 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" containerID="cri-o://8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.774859 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.775102 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n28r8" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" containerID="cri-o://1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.786297 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.787044 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.788522 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.788769 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7nwc2" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" containerID="cri-o://6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" gracePeriod=30 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.816631 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"] Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.931591 4792 generic.go:334] "Generic (PLEG): container finished" podID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerID="8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0" exitCode=0 Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.931931 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerDied","Data":"8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0"} 
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.956949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.959000 4792 generic.go:334] "Generic (PLEG): container finished" podID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerID="9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32" exitCode=0
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.959090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32"}
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.969676 4792 generic.go:334] "Generic (PLEG): container finished" podID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerID="5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63" exitCode=0
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.969732 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63"}
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.975041 4792 generic.go:334] "Generic (PLEG): container finished" podID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerID="1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f" exitCode=0
Mar 01 09:15:43 crc kubenswrapper[4792]: I0301 09:15:43.975085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f"}
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.071528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.071585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.071606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.074675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.090875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46fe59e7-8122-4621-ae8d-237a91daee5e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.097393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stz4\" (UniqueName: \"kubernetes.io/projected/46fe59e7-8122-4621-ae8d-237a91daee5e-kube-api-access-6stz4\") pod \"marketplace-operator-79b997595-gfkbs\" (UID: \"46fe59e7-8122-4621-ae8d-237a91daee5e\") " pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.113178 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.217125 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.298563 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.305186 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.307012 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374389 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374434 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.374529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") pod \"dff0d675-52dd-4cac-a7be-8750333c28e3\" (UID: \"dff0d675-52dd-4cac-a7be-8750333c28e3\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.375417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities" (OuterVolumeSpecName: "utilities") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.379356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb" (OuterVolumeSpecName: "kube-api-access-thnsb") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "kube-api-access-thnsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.379566 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.448877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dff0d675-52dd-4cac-a7be-8750333c28e3" (UID: "dff0d675-52dd-4cac-a7be-8750333c28e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475506 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475547 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") pod \"9073e3da-2d6f-48a3-907a-e347f28559ae\" (UID: \"9073e3da-2d6f-48a3-907a-e347f28559ae\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") pod \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\" (UID: \"e03dee1d-d7ca-422c-8af6-faa0c1af3863\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") pod \"22c7c368-3523-4224-aebd-59b29640bed0\" (UID: \"22c7c368-3523-4224-aebd-59b29640bed0\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") pod \"e37e6dcb-be13-4787-8555-3ba1050f7b77\" (UID: \"e37e6dcb-be13-4787-8555-3ba1050f7b77\") "
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475963 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475976 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thnsb\" (UniqueName: \"kubernetes.io/projected/dff0d675-52dd-4cac-a7be-8750333c28e3-kube-api-access-thnsb\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.475988 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dff0d675-52dd-4cac-a7be-8750333c28e3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.477690 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities" (OuterVolumeSpecName: "utilities") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.478944 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k" (OuterVolumeSpecName: "kube-api-access-nc28k") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "kube-api-access-nc28k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.478976 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities" (OuterVolumeSpecName: "utilities") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.480412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl" (OuterVolumeSpecName: "kube-api-access-kc5gl") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "kube-api-access-kc5gl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.482296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr" (OuterVolumeSpecName: "kube-api-access-djdkr") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "kube-api-access-djdkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.482932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities" (OuterVolumeSpecName: "utilities") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.483356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.476995 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.490750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt" (OuterVolumeSpecName: "kube-api-access-r7pkt") pod "e37e6dcb-be13-4787-8555-3ba1050f7b77" (UID: "e37e6dcb-be13-4787-8555-3ba1050f7b77"). InnerVolumeSpecName "kube-api-access-r7pkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.505602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e03dee1d-d7ca-422c-8af6-faa0c1af3863" (UID: "e03dee1d-d7ca-422c-8af6-faa0c1af3863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.533606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9073e3da-2d6f-48a3-907a-e347f28559ae" (UID: "9073e3da-2d6f-48a3-907a-e347f28559ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580557 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7pkt\" (UniqueName: \"kubernetes.io/projected/e37e6dcb-be13-4787-8555-3ba1050f7b77-kube-api-access-r7pkt\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580583 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc28k\" (UniqueName: \"kubernetes.io/projected/9073e3da-2d6f-48a3-907a-e347f28559ae-kube-api-access-nc28k\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580595 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580605 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580614 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580624 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc5gl\" (UniqueName: \"kubernetes.io/projected/e03dee1d-d7ca-422c-8af6-faa0c1af3863-kube-api-access-kc5gl\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580632 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9073e3da-2d6f-48a3-907a-e347f28559ae-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580640 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580649 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e37e6dcb-be13-4787-8555-3ba1050f7b77-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580657 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e03dee1d-d7ca-422c-8af6-faa0c1af3863-utilities\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.580666 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdkr\" (UniqueName: \"kubernetes.io/projected/22c7c368-3523-4224-aebd-59b29640bed0-kube-api-access-djdkr\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.603736 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22c7c368-3523-4224-aebd-59b29640bed0" (UID: "22c7c368-3523-4224-aebd-59b29640bed0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.682069 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22c7c368-3523-4224-aebd-59b29640bed0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.702592 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gfkbs"]
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6" event={"ID":"e37e6dcb-be13-4787-8555-3ba1050f7b77","Type":"ContainerDied","Data":"5fe7291196c34fc28ce9dd0b3ec6175bb85545475a2aafe649770e45dc76a617"}
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990341 4792 scope.go:117] "RemoveContainer" containerID="8c78eddf897c2741ce992ebcf0cd416c60d059664ef1623df2239b0550809bd0"
Mar 01 09:15:44 crc kubenswrapper[4792]: I0301 09:15:44.990428 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gk6c6"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000539 4792 generic.go:334] "Generic (PLEG): container finished" podID="22c7c368-3523-4224-aebd-59b29640bed0" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec" exitCode=0
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000612 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7nwc2"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.000636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7nwc2" event={"ID":"22c7c368-3523-4224-aebd-59b29640bed0","Type":"ContainerDied","Data":"52cac0b41f9f5231fc4b99f244ac1d2bd2e66edf0a6640e4bac8c95c340ce560"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.003491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxb87" event={"ID":"9073e3da-2d6f-48a3-907a-e347f28559ae","Type":"ContainerDied","Data":"f1bc5d15012443ccc9990bce32da439c2d26cd8bac7e62fa7b81593bfe925710"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.003578 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxb87"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.018199 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cw675"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.018506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cw675" event={"ID":"dff0d675-52dd-4cac-a7be-8750333c28e3","Type":"ContainerDied","Data":"339bd57c1cc13f4119c941d59da1a1ea961d6edda5147620d400c9a04f2e8ceb"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021225 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n28r8"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n28r8" event={"ID":"e03dee1d-d7ca-422c-8af6-faa0c1af3863","Type":"ContainerDied","Data":"385e5f3c0c4fe34a9ca54a291cae71ccbb846b542d87341115b79874eadd771a"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.021306 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.025055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" event={"ID":"46fe59e7-8122-4621-ae8d-237a91daee5e","Type":"ContainerStarted","Data":"42cd5bea5c56421991c632f84b06664b84686faff6b0f1d3b86dd94bdc098c36"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.025088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" event={"ID":"46fe59e7-8122-4621-ae8d-237a91daee5e","Type":"ContainerStarted","Data":"1236fb768a5238a78e1ed5ceed2499ea772b0349c833723d91b36ba9abaee434"}
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.027365 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.027435 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gfkbs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.027459 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" podUID="46fe59e7-8122-4621-ae8d-237a91daee5e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.028113 4792 scope.go:117] "RemoveContainer" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.038376 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gk6c6"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.050361 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" podStartSLOduration=2.050339321 podStartE2EDuration="2.050339321s" podCreationTimestamp="2026-03-01 09:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:15:45.045328674 +0000 UTC m=+474.287207871" watchObservedRunningTime="2026-03-01 09:15:45.050339321 +0000 UTC m=+474.292218518"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.064046 4792 scope.go:117] "RemoveContainer" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.071245 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.076771 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7nwc2"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.087522 4792 scope.go:117] "RemoveContainer" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.096639 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cw675"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.106743 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cw675"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.114683 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.114751 4792 scope.go:117] "RemoveContainer" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115255 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": container with ID starting with 6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec not found: ID does not exist" containerID="6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec"} err="failed to get container status \"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": rpc error: code = NotFound desc = could not find container \"6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec\": container with ID starting with 6796f3254defaa157a9986218237b224b14d5a54cd6e180bf82e9634212462ec not found: ID does not exist"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115315 4792 scope.go:117] "RemoveContainer" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115556 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": container with ID starting with 225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1 not found: ID does not exist" containerID="225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115581 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1"} err="failed to get container status \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": rpc error: code = NotFound desc = could not find container \"225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1\": container with ID starting with 225f228a4c6b7dec18ef0a0e39f231e76893c213f11c4f32a59ac86f3a689ed1 not found: ID does not exist"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115595 4792 scope.go:117] "RemoveContainer" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.115761 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": container with ID starting with 821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46 not found: ID does not exist" containerID="821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115780 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46"} err="failed to get container status \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": rpc error: code = NotFound desc = could not find container \"821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46\": container with ID starting with 821167de69ed6dbed932952dc938c85a7e5a7b97b04ac4c8184f76750cd36e46 not found: ID does not exist"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.115792 4792 scope.go:117] "RemoveContainer" containerID="9dea4572b2cae8fb3cd81b4d5fe3e648bf003be8e11b2fe3e243bf7601f66f32"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.116821 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n28r8"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.133208 4792 scope.go:117] "RemoveContainer" containerID="4b4748f2f641b6c36501b4a63c23878d8149ab901a48ba6ca75290682667f801"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.138961 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxb87"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.142042 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wxb87"]
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.157820 4792 scope.go:117] "RemoveContainer" containerID="cabcf6681e53e945ec2b29b89c123577ba10471c42575321a6f1da2be37ffe2a"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.173379 4792 scope.go:117] "RemoveContainer" containerID="5bb5a5f0949169743626acf007f4aa939920851854703451752f35c1714d5b63"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.186840 4792 scope.go:117] "RemoveContainer" containerID="60c5c14884f306ec02b79c73b52f33a5df66be0f71db3ee1b928c0b932dd06d6"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.227021 4792 scope.go:117] "RemoveContainer" containerID="2793d9d53a76fa1bceb0317b96453e19780c382a91f9c7fd6f680978a1c2a121"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.241086 4792 scope.go:117] "RemoveContainer" containerID="1897e3e89e2859f0dcfe575b896a35cdde57822147e478674713823e7d25153f"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.254094 4792 scope.go:117] "RemoveContainer" containerID="d7a22eb25032f48508905d8110fc2779a9b8e1d8380aa44e93999853b14a1f56"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.268930 4792 scope.go:117] "RemoveContainer" containerID="0c84982bc61501954eca9c9293432f6ed8f745c671471318271733558785ba39"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.414741 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c7c368-3523-4224-aebd-59b29640bed0" path="/var/lib/kubelet/pods/22c7c368-3523-4224-aebd-59b29640bed0/volumes"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.415493 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" path="/var/lib/kubelet/pods/9073e3da-2d6f-48a3-907a-e347f28559ae/volumes"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.416055 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" path="/var/lib/kubelet/pods/dff0d675-52dd-4cac-a7be-8750333c28e3/volumes"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.417019 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" path="/var/lib/kubelet/pods/e03dee1d-d7ca-422c-8af6-faa0c1af3863/volumes"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.417594 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" path="/var/lib/kubelet/pods/e37e6dcb-be13-4787-8555-3ba1050f7b77/volumes"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.952307 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"]
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953605 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953624 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953636 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953644 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953660 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953667 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953677 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953685 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953694 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953701 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953716 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953724 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-content"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953740 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="extract-content"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953752 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953759 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-utilities"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953768 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-content"
Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953777 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="extract-content"
Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953789 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="extract-content" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953824 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953832 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: E0301 09:15:45.953842 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953849 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953963 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff0d675-52dd-4cac-a7be-8750333c28e3" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953980 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37e6dcb-be13-4787-8555-3ba1050f7b77" containerName="marketplace-operator" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.953989 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9073e3da-2d6f-48a3-907a-e347f28559ae" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.954000 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03dee1d-d7ca-422c-8af6-faa0c1af3863" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.954012 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c7c368-3523-4224-aebd-59b29640bed0" containerName="registry-server" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.955740 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.956833 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"] Mar 01 09:15:45 crc kubenswrapper[4792]: I0301 09:15:45.957723 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.038212 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gfkbs" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101830 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: 
\"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.101958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.151945 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.152878 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.154674 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.161419 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.202967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " 
pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203063 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-utilities\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.203739 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003e690-c3dd-4236-a95c-a0fb6ccb438e-catalog-content\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.220647 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxjb6\" (UniqueName: \"kubernetes.io/projected/3003e690-c3dd-4236-a95c-a0fb6ccb438e-kube-api-access-hxjb6\") pod \"redhat-marketplace-zjvjw\" (UID: \"3003e690-c3dd-4236-a95c-a0fb6ccb438e\") " pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.282998 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.304665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406104 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod 
\"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-utilities\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.406769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38bc0c09-286e-427a-95c2-8e2c9213b142-catalog-content\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.433272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwgl\" (UniqueName: \"kubernetes.io/projected/38bc0c09-286e-427a-95c2-8e2c9213b142-kube-api-access-4bwgl\") pod \"redhat-operators-7fdqh\" (UID: \"38bc0c09-286e-427a-95c2-8e2c9213b142\") " pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.479427 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.649347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fdqh"] Mar 01 09:15:46 crc kubenswrapper[4792]: W0301 09:15:46.662631 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bc0c09_286e_427a_95c2_8e2c9213b142.slice/crio-11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991 WatchSource:0}: Error finding container 11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991: Status 404 returned error can't find the container with id 11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991 Mar 01 09:15:46 crc kubenswrapper[4792]: I0301 09:15:46.684675 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjvjw"] Mar 01 09:15:46 crc kubenswrapper[4792]: W0301 09:15:46.692593 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3003e690_c3dd_4236_a95c_a0fb6ccb438e.slice/crio-75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0 WatchSource:0}: Error finding container 75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0: Status 404 returned error can't find the container with id 75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041539 4792 generic.go:334] "Generic (PLEG): container finished" podID="38bc0c09-286e-427a-95c2-8e2c9213b142" containerID="c13d3b38df8892b424aaefbf200315183e532ad4e85474cdcebdfa2a35e8d0f9" exitCode=0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" 
event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerDied","Data":"c13d3b38df8892b424aaefbf200315183e532ad4e85474cdcebdfa2a35e8d0f9"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.041955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"11328d282ef0220b6921dadaddb7e58797ad6ade046d4093abc16c9afd96b991"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.052410 4792 generic.go:334] "Generic (PLEG): container finished" podID="3003e690-c3dd-4236-a95c-a0fb6ccb438e" containerID="c4c9b2c9b9a9f0f3a82ec42aed5a463fb7f5caafa0cf3d2f1487056d63745338" exitCode=0 Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.053363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerDied","Data":"c4c9b2c9b9a9f0f3a82ec42aed5a463fb7f5caafa0cf3d2f1487056d63745338"} Mar 01 09:15:47 crc kubenswrapper[4792]: I0301 09:15:47.053416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerStarted","Data":"75f1c5432baa41cfe8d19814ad008dcc8898229ebacd240889398863afc86cc0"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.059270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.064770 4792 generic.go:334] "Generic (PLEG): container finished" podID="3003e690-c3dd-4236-a95c-a0fb6ccb438e" containerID="63da93f364346bdc218e269d6eedf40c892f902c3beea9faf5fa777a6feba6b1" exitCode=0 Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 
09:15:48.064810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerDied","Data":"63da93f364346bdc218e269d6eedf40c892f902c3beea9faf5fa777a6feba6b1"} Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.355580 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.356482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.360230 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.407566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.530737 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2d6xs\" (UniqueName: \"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.558244 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.559154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.561154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.564477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.632659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6xs\" (UniqueName: 
\"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.633511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-utilities\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.633610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c448d6-e926-4b07-8aec-8195d42d2e30-catalog-content\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.651644 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6xs\" (UniqueName: \"kubernetes.io/projected/55c448d6-e926-4b07-8aec-8195d42d2e30-kube-api-access-2d6xs\") pod \"certified-operators-sksw8\" (UID: \"55c448d6-e926-4b07-8aec-8195d42d2e30\") " pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.679965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.733807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.733853 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:48 crc kubenswrapper[4792]: I0301 09:15:48.734089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.835546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod 
\"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-utilities\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.836819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d875f1af-e90b-4882-b472-f91651d468a6-catalog-content\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.856289 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz5td\" (UniqueName: \"kubernetes.io/projected/d875f1af-e90b-4882-b472-f91651d468a6-kube-api-access-hz5td\") pod \"community-operators-48zdf\" (UID: \"d875f1af-e90b-4882-b472-f91651d468a6\") " pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.891863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sksw8"] Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:48.896051 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:49 crc kubenswrapper[4792]: W0301 09:15:48.897026 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c448d6_e926_4b07_8aec_8195d42d2e30.slice/crio-7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc WatchSource:0}: Error finding container 7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc: Status 404 returned error can't find the container with id 7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.072436 4792 generic.go:334] "Generic (PLEG): container finished" podID="38bc0c09-286e-427a-95c2-8e2c9213b142" containerID="839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290" exitCode=0 Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.072516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerDied","Data":"839dbd9c94d016359cf8c4184d8a4654289151b502be0d350e0f115d237e1290"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.078458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjvjw" event={"ID":"3003e690-c3dd-4236-a95c-a0fb6ccb438e","Type":"ContainerStarted","Data":"5b9418a53311ff310523f43d73f4a3f2bea95acfbee1c9d2d5e16f15273d74b6"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082493 4792 generic.go:334] "Generic (PLEG): container finished" podID="55c448d6-e926-4b07-8aec-8195d42d2e30" containerID="69f02268daf6c0c583cf7f502874715d5874c2b78b768e3cbfdb267828eb7cd1" exitCode=0 Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" 
event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerDied","Data":"69f02268daf6c0c583cf7f502874715d5874c2b78b768e3cbfdb267828eb7cd1"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.082566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"7188e1badb0fe92bbaddbc026f4711272ea58838c819448b2a409d67c5885dbc"} Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.909869 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjvjw" podStartSLOduration=3.474931759 podStartE2EDuration="4.909850191s" podCreationTimestamp="2026-03-01 09:15:45 +0000 UTC" firstStartedPulling="2026-03-01 09:15:47.054833933 +0000 UTC m=+476.296713130" lastFinishedPulling="2026-03-01 09:15:48.489752365 +0000 UTC m=+477.731631562" observedRunningTime="2026-03-01 09:15:49.137040021 +0000 UTC m=+478.378919218" watchObservedRunningTime="2026-03-01 09:15:49.909850191 +0000 UTC m=+479.151729388" Mar 01 09:15:49 crc kubenswrapper[4792]: I0301 09:15:49.912669 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48zdf"] Mar 01 09:15:49 crc kubenswrapper[4792]: W0301 09:15:49.921075 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd875f1af_e90b_4882_b472_f91651d468a6.slice/crio-e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a WatchSource:0}: Error finding container e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a: Status 404 returned error can't find the container with id e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.090943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" 
event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerDied","Data":"38e219276e0ef85d6a1ed88e5e195f9432165d376cd2b7270a896c136d1f9c41"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.090889 4792 generic.go:334] "Generic (PLEG): container finished" podID="d875f1af-e90b-4882-b472-f91651d468a6" containerID="38e219276e0ef85d6a1ed88e5e195f9432165d376cd2b7270a896c136d1f9c41" exitCode=0 Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.091099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerStarted","Data":"e28c7ca553ea613c89182637d5fa5350ca10a96c5efd6579efcdfacaa062ae3a"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.094000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fdqh" event={"ID":"38bc0c09-286e-427a-95c2-8e2c9213b142","Type":"ContainerStarted","Data":"c64c9d5273ae6c0bdebf9e4e070eb4a5f2be4f9396bee3796f8bb287aa3dfa8a"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.097938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78"} Mar 01 09:15:50 crc kubenswrapper[4792]: I0301 09:15:50.127718 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fdqh" podStartSLOduration=1.698284781 podStartE2EDuration="4.127702859s" podCreationTimestamp="2026-03-01 09:15:46 +0000 UTC" firstStartedPulling="2026-03-01 09:15:47.043222669 +0000 UTC m=+476.285101866" lastFinishedPulling="2026-03-01 09:15:49.472640747 +0000 UTC m=+478.714519944" observedRunningTime="2026-03-01 09:15:50.126068198 +0000 UTC m=+479.367947395" watchObservedRunningTime="2026-03-01 09:15:50.127702859 +0000 UTC 
m=+479.369582056" Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103485 4792 generic.go:334] "Generic (PLEG): container finished" podID="55c448d6-e926-4b07-8aec-8195d42d2e30" containerID="d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78" exitCode=0 Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerDied","Data":"d3850228aa71b2c0cb870102a30e8019bafab1c2433eb51d1d2f07b39afa4c78"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.103582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sksw8" event={"ID":"55c448d6-e926-4b07-8aec-8195d42d2e30","Type":"ContainerStarted","Data":"6ad93f8a6eda0fc3085969611c1dfac7079fe4c76d1610f5bd0732cb92d39fa4"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.105487 4792 generic.go:334] "Generic (PLEG): container finished" podID="d875f1af-e90b-4882-b472-f91651d468a6" containerID="760877f9a7d61beb560ecc8cd400c7c3b45e280e1fd92ca600a9762634d49799" exitCode=0 Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.105532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerDied","Data":"760877f9a7d61beb560ecc8cd400c7c3b45e280e1fd92ca600a9762634d49799"} Mar 01 09:15:51 crc kubenswrapper[4792]: I0301 09:15:51.128543 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sksw8" podStartSLOduration=1.497088545 podStartE2EDuration="3.128528165s" podCreationTimestamp="2026-03-01 09:15:48 +0000 UTC" firstStartedPulling="2026-03-01 09:15:49.083560269 +0000 UTC m=+478.325439466" lastFinishedPulling="2026-03-01 09:15:50.714999899 +0000 UTC m=+479.956879086" observedRunningTime="2026-03-01 
09:15:51.123117268 +0000 UTC m=+480.364996465" watchObservedRunningTime="2026-03-01 09:15:51.128528165 +0000 UTC m=+480.370407362" Mar 01 09:15:52 crc kubenswrapper[4792]: I0301 09:15:52.114318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48zdf" event={"ID":"d875f1af-e90b-4882-b472-f91651d468a6","Type":"ContainerStarted","Data":"a01d67a8724088d0e436e85a9a9d642be57e702c9cd58c95fb5b239c0a0717e0"} Mar 01 09:15:52 crc kubenswrapper[4792]: I0301 09:15:52.136636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-48zdf" podStartSLOduration=2.702169674 podStartE2EDuration="4.136603544s" podCreationTimestamp="2026-03-01 09:15:48 +0000 UTC" firstStartedPulling="2026-03-01 09:15:50.092385776 +0000 UTC m=+479.334264963" lastFinishedPulling="2026-03-01 09:15:51.526819646 +0000 UTC m=+480.768698833" observedRunningTime="2026-03-01 09:15:52.134241674 +0000 UTC m=+481.376120861" watchObservedRunningTime="2026-03-01 09:15:52.136603544 +0000 UTC m=+481.378482751" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.283498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.284610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.324830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.480159 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.480208 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.515997 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.670008 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5rg6z" Mar 01 09:15:56 crc kubenswrapper[4792]: I0301 09:15:56.725027 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:15:57 crc kubenswrapper[4792]: I0301 09:15:57.190022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7fdqh" Mar 01 09:15:57 crc kubenswrapper[4792]: I0301 09:15:57.205734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjvjw" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.680532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.681203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.745538 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.896840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 09:15:58.897123 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:58 crc kubenswrapper[4792]: I0301 
09:15:58.943522 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:15:59 crc kubenswrapper[4792]: I0301 09:15:59.180898 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sksw8" Mar 01 09:15:59 crc kubenswrapper[4792]: I0301 09:15:59.191565 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-48zdf" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.140266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.141397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.144819 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.145376 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.148327 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.155977 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.274984 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " 
pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.376768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.395400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"auto-csr-approver-29539276-sbq86\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.467257 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:00 crc kubenswrapper[4792]: I0301 09:16:00.664651 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:16:00 crc kubenswrapper[4792]: W0301 09:16:00.670751 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ffc24e3_2b40_4368_91a7_474239cc46fc.slice/crio-10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46 WatchSource:0}: Error finding container 10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46: Status 404 returned error can't find the container with id 10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46 Mar 01 09:16:01 crc kubenswrapper[4792]: I0301 09:16:01.160215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" 
event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerStarted","Data":"10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46"} Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.176647 4792 generic.go:334] "Generic (PLEG): container finished" podID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerID="7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1" exitCode=0 Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.176723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerDied","Data":"7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1"} Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.943177 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:16:04 crc kubenswrapper[4792]: I0301 09:16:04.943227 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.441135 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.540466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") pod \"0ffc24e3-2b40-4368-91a7-474239cc46fc\" (UID: \"0ffc24e3-2b40-4368-91a7-474239cc46fc\") " Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.552157 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t" (OuterVolumeSpecName: "kube-api-access-zjx6t") pod "0ffc24e3-2b40-4368-91a7-474239cc46fc" (UID: "0ffc24e3-2b40-4368-91a7-474239cc46fc"). InnerVolumeSpecName "kube-api-access-zjx6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:16:05 crc kubenswrapper[4792]: I0301 09:16:05.641852 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjx6t\" (UniqueName: \"kubernetes.io/projected/0ffc24e3-2b40-4368-91a7-474239cc46fc-kube-api-access-zjx6t\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539276-sbq86" event={"ID":"0ffc24e3-2b40-4368-91a7-474239cc46fc","Type":"ContainerDied","Data":"10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46"} Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187713 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d4f99a6afeafcf8f1f54bf00feff9a933803d7a9ba797926dca7f642308a46" Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.187739 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539276-sbq86" Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.507832 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:16:06 crc kubenswrapper[4792]: I0301 09:16:06.517435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539270-q7hck"] Mar 01 09:16:07 crc kubenswrapper[4792]: I0301 09:16:07.421046 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4130507-2de2-48c2-9c3f-e9474aeca556" path="/var/lib/kubelet/pods/b4130507-2de2-48c2-9c3f-e9474aeca556/volumes" Mar 01 09:16:21 crc kubenswrapper[4792]: I0301 09:16:21.783285 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry" containerID="cri-o://0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" gracePeriod=30 Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.192785 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281399 4792 generic.go:334] "Generic (PLEG): container finished" podID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" exitCode=0 Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281457 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281473 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerDied","Data":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"} Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4jwnr" event={"ID":"f147eb3a-0f65-4ecb-b1a2-5d561c21253c","Type":"ContainerDied","Data":"21cb9ec22cbd36ee1804b1e2f96533e30f8509709de84a1859f63834b8df4b3f"} Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.281527 4792 scope.go:117] "RemoveContainer" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.293581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294102 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294137 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: 
\"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294178 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k789z\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294231 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.294271 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") pod \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\" (UID: \"f147eb3a-0f65-4ecb-b1a2-5d561c21253c\") " Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296184 4792 scope.go:117] "RemoveContainer" 
containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" Mar 01 09:16:22 crc kubenswrapper[4792]: E0301 09:16:22.296546 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": container with ID starting with 0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef not found: ID does not exist" containerID="0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296573 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef"} err="failed to get container status \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": rpc error: code = NotFound desc = could not find container \"0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef\": container with ID starting with 0779d989a019df4f783c6ed27e1b08237b43aef3c94320d704c90c5367c173ef not found: ID does not exist" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.296702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.297223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309227 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309890 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z" (OuterVolumeSpecName: "kube-api-access-k789z") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "kube-api-access-k789z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309948 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.309986 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.310069 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.312347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f147eb3a-0f65-4ecb-b1a2-5d561c21253c" (UID: "f147eb3a-0f65-4ecb-b1a2-5d561c21253c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397312 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397531 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397612 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397664 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k789z\" (UniqueName: 
\"kubernetes.io/projected/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-kube-api-access-k789z\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397717 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397840 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.397893 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f147eb3a-0f65-4ecb-b1a2-5d561c21253c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.624429 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:16:22 crc kubenswrapper[4792]: I0301 09:16:22.631303 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4jwnr"] Mar 01 09:16:23 crc kubenswrapper[4792]: I0301 09:16:23.416796 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" path="/var/lib/kubelet/pods/f147eb3a-0f65-4ecb-b1a2-5d561c21253c/volumes" Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943163 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943703 
4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.943743 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.944252 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:16:34 crc kubenswrapper[4792]: I0301 09:16:34.944309 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" gracePeriod=600 Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.355611 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" exitCode=0 Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.355694 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289"} Mar 01 09:16:35 crc kubenswrapper[4792]: 
I0301 09:16:35.356059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"} Mar 01 09:16:35 crc kubenswrapper[4792]: I0301 09:16:35.356102 4792 scope.go:117] "RemoveContainer" containerID="47be78f8bf99eecd992fefae44a22c4fe46a33c39ceff683cae1dcd5ee58dcac" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.134250 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"] Mar 01 09:18:00 crc kubenswrapper[4792]: E0301 09:18:00.135287 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135305 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc" Mar 01 09:18:00 crc kubenswrapper[4792]: E0301 09:18:00.135319 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135329 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135477 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" containerName="oc" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.135489 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f147eb3a-0f65-4ecb-b1a2-5d561c21253c" containerName="registry" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.136003 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"] Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.140693 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.265250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.366724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.390617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"auto-csr-approver-29539278-dkkbp\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " 
pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.461253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.864816 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"] Mar 01 09:18:00 crc kubenswrapper[4792]: I0301 09:18:00.874983 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:18:01 crc kubenswrapper[4792]: I0301 09:18:01.860062 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerStarted","Data":"403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d"} Mar 01 09:18:02 crc kubenswrapper[4792]: I0301 09:18:02.870687 4792 generic.go:334] "Generic (PLEG): container finished" podID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerID="ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95" exitCode=0 Mar 01 09:18:02 crc kubenswrapper[4792]: I0301 09:18:02.870775 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerDied","Data":"ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95"} Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.143585 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.307260 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") pod \"0d17aa56-5b61-403d-9d20-cb300aabc44d\" (UID: \"0d17aa56-5b61-403d-9d20-cb300aabc44d\") " Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.312047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65" (OuterVolumeSpecName: "kube-api-access-wgk65") pod "0d17aa56-5b61-403d-9d20-cb300aabc44d" (UID: "0d17aa56-5b61-403d-9d20-cb300aabc44d"). InnerVolumeSpecName "kube-api-access-wgk65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.409155 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgk65\" (UniqueName: \"kubernetes.io/projected/0d17aa56-5b61-403d-9d20-cb300aabc44d-kube-api-access-wgk65\") on node \"crc\" DevicePath \"\"" Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" event={"ID":"0d17aa56-5b61-403d-9d20-cb300aabc44d","Type":"ContainerDied","Data":"403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d"} Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883585 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539278-dkkbp" Mar 01 09:18:04 crc kubenswrapper[4792]: I0301 09:18:04.883603 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403087d613e3adc9d0ef98336533cbc95bef7d9cc37a9d027b28a5ec1411b85d" Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.205109 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.213495 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539272-nq8dk"] Mar 01 09:18:05 crc kubenswrapper[4792]: I0301 09:18:05.414906 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d574e82-f840-4f0c-982d-f6a133bd64ae" path="/var/lib/kubelet/pods/8d574e82-f840-4f0c-982d-f6a133bd64ae/volumes" Mar 01 09:18:51 crc kubenswrapper[4792]: I0301 09:18:51.787018 4792 scope.go:117] "RemoveContainer" containerID="5f51f2f66c61a102a6b43ee525bfb8b5ff9da77472d4107c4db4ba5e29f6a9ee" Mar 01 09:18:51 crc kubenswrapper[4792]: I0301 09:18:51.830050 4792 scope.go:117] "RemoveContainer" containerID="f692f356115e5b53ef6a4d81f9a4c258c05c49397508f23df7e1bd78fc94331c" Mar 01 09:19:04 crc kubenswrapper[4792]: I0301 09:19:04.943067 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:19:04 crc kubenswrapper[4792]: I0301 09:19:04.943745 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 01 09:19:34 crc kubenswrapper[4792]: I0301 09:19:34.942786 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:19:34 crc kubenswrapper[4792]: I0301 09:19:34.943387 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.140109 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:20:00 crc kubenswrapper[4792]: E0301 09:20:00.140898 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.140935 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.141064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" containerName="oc" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.141461 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.143224 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.144245 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.144386 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.153895 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.165095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.265981 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.285862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"auto-csr-approver-29539280-gz7v9\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " 
pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.483024 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:00 crc kubenswrapper[4792]: I0301 09:20:00.683245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:20:01 crc kubenswrapper[4792]: I0301 09:20:01.634842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerStarted","Data":"95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6"} Mar 01 09:20:02 crc kubenswrapper[4792]: I0301 09:20:02.642426 4792 generic.go:334] "Generic (PLEG): container finished" podID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerID="752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc" exitCode=0 Mar 01 09:20:02 crc kubenswrapper[4792]: I0301 09:20:02.642502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerDied","Data":"752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc"} Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.898751 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.907124 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") pod \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\" (UID: \"71c922d5-9de8-48d8-9f96-ad47d1d4017e\") " Mar 01 09:20:03 crc kubenswrapper[4792]: I0301 09:20:03.914974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt" (OuterVolumeSpecName: "kube-api-access-frqvt") pod "71c922d5-9de8-48d8-9f96-ad47d1d4017e" (UID: "71c922d5-9de8-48d8-9f96-ad47d1d4017e"). InnerVolumeSpecName "kube-api-access-frqvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.008521 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frqvt\" (UniqueName: \"kubernetes.io/projected/71c922d5-9de8-48d8-9f96-ad47d1d4017e-kube-api-access-frqvt\") on node \"crc\" DevicePath \"\"" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" event={"ID":"71c922d5-9de8-48d8-9f96-ad47d1d4017e","Type":"ContainerDied","Data":"95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6"} Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657827 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d34678a38dd8b8a79d0061e52054c7c7d27d8ffc5a67d6b3d18d4b9720d4e6" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.657465 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539280-gz7v9" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943391 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943466 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.943524 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.944398 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.944486 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" gracePeriod=600 Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.987958 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:20:04 crc kubenswrapper[4792]: I0301 09:20:04.997859 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539274-zckv2"] Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.427111 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a3bf03-822b-4b69-93a3-b420d8f58efd" path="/var/lib/kubelet/pods/81a3bf03-822b-4b69-93a3-b420d8f58efd/volumes" Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664514 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" exitCode=0 Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209"} Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} Mar 01 09:20:05 crc kubenswrapper[4792]: I0301 09:20:05.664607 4792 scope.go:117] "RemoveContainer" containerID="d953fb6bdd1dde8b39f7c850e5987057bc7f87ba2081744a1e12425c6ecc8289" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.718001 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:48 crc kubenswrapper[4792]: E0301 09:20:48.718807 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: 
I0301 09:20:48.718822 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.718967 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" containerName="oc" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.719410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.726575 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.727025 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bz5ls" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.727327 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.729689 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.735415 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.740980 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.749460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-njvxn" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.758720 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.781513 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.782365 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.790404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tgd\" (UniqueName: \"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.790470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.791011 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dc8jw" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.796315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891510 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tgd\" (UniqueName: \"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.891635 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.915754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqhx\" (UniqueName: \"kubernetes.io/projected/2071887a-31a9-428d-92d0-bf8a361011ca-kube-api-access-9fqhx\") pod \"cert-manager-cainjector-cf98fcc89-tm5s6\" (UID: \"2071887a-31a9-428d-92d0-bf8a361011ca\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.922670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tgd\" (UniqueName: 
\"kubernetes.io/projected/bf71ada0-c7b2-4255-bb2c-31ec3309a29d-kube-api-access-z6tgd\") pod \"cert-manager-858654f9db-4qgsm\" (UID: \"bf71ada0-c7b2-4255-bb2c-31ec3309a29d\") " pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:48 crc kubenswrapper[4792]: I0301 09:20:48.992715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.008171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2vcr\" (UniqueName: \"kubernetes.io/projected/a03eedd4-ecde-4905-95a7-c43b45ef9da9-kube-api-access-g2vcr\") pod \"cert-manager-webhook-687f57d79b-rckpb\" (UID: \"a03eedd4-ecde-4905-95a7-c43b45ef9da9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.053011 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.073215 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4qgsm" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.094144 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.493779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6"] Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.497779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4qgsm"] Mar 01 09:20:49 crc kubenswrapper[4792]: W0301 09:20:49.510542 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2071887a_31a9_428d_92d0_bf8a361011ca.slice/crio-00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3 WatchSource:0}: Error finding container 00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3: Status 404 returned error can't find the container with id 00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3 Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.557496 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rckpb"] Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.688001 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" event={"ID":"a03eedd4-ecde-4905-95a7-c43b45ef9da9","Type":"ContainerStarted","Data":"47449944f9bdaccbb20eb4038f0bd25ccb1bd8c02d38d7e9132edb7737008e09"} Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.689271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4qgsm" event={"ID":"bf71ada0-c7b2-4255-bb2c-31ec3309a29d","Type":"ContainerStarted","Data":"8497a59c82276de798f225f8a47dafc1a3d068d0592c3a9073082c3b2de27d7b"} Mar 01 09:20:49 crc kubenswrapper[4792]: I0301 09:20:49.690307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" 
event={"ID":"2071887a-31a9-428d-92d0-bf8a361011ca","Type":"ContainerStarted","Data":"00323368cd944ded4fac2cb52d8703ab37cd9a849120bafe9a11659e90d485a3"} Mar 01 09:20:51 crc kubenswrapper[4792]: I0301 09:20:51.898283 4792 scope.go:117] "RemoveContainer" containerID="8ef57da9b21fb114ba0f54a09e6174de667175684b505b54b0c846389388b402" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.711643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" event={"ID":"2071887a-31a9-428d-92d0-bf8a361011ca","Type":"ContainerStarted","Data":"37e2f9809f32423b862ed39deb790929e382850f002f42f2a85d369a95317f6e"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" event={"ID":"a03eedd4-ecde-4905-95a7-c43b45ef9da9","Type":"ContainerStarted","Data":"31faf93bab9a2d4d492d2210a05a44e8bab4c57026a929527668b7edcccfd4ce"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713281 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.713965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4qgsm" event={"ID":"bf71ada0-c7b2-4255-bb2c-31ec3309a29d","Type":"ContainerStarted","Data":"909a22d7f23369670e6fa7fc2dfa31a5f93d0d6e7233311d411b3b32275ba42e"} Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.725421 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tm5s6" podStartSLOduration=2.02458537 podStartE2EDuration="5.725402698s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.512298049 +0000 UTC m=+778.754177286" lastFinishedPulling="2026-03-01 09:20:53.213115417 +0000 UTC m=+782.454994614" observedRunningTime="2026-03-01 
09:20:53.724860254 +0000 UTC m=+782.966739461" watchObservedRunningTime="2026-03-01 09:20:53.725402698 +0000 UTC m=+782.967281895" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.742270 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" podStartSLOduration=2.036089143 podStartE2EDuration="5.742251663s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.561143395 +0000 UTC m=+778.803022592" lastFinishedPulling="2026-03-01 09:20:53.267305875 +0000 UTC m=+782.509185112" observedRunningTime="2026-03-01 09:20:53.741564586 +0000 UTC m=+782.983443783" watchObservedRunningTime="2026-03-01 09:20:53.742251663 +0000 UTC m=+782.984130860" Mar 01 09:20:53 crc kubenswrapper[4792]: I0301 09:20:53.763106 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-4qgsm" podStartSLOduration=2.060401783 podStartE2EDuration="5.763088107s" podCreationTimestamp="2026-03-01 09:20:48 +0000 UTC" firstStartedPulling="2026-03-01 09:20:49.510008163 +0000 UTC m=+778.751887360" lastFinishedPulling="2026-03-01 09:20:53.212694487 +0000 UTC m=+782.454573684" observedRunningTime="2026-03-01 09:20:53.75955078 +0000 UTC m=+783.001429977" watchObservedRunningTime="2026-03-01 09:20:53.763088107 +0000 UTC m=+783.004967304" Mar 01 09:20:59 crc kubenswrapper[4792]: I0301 09:20:59.098333 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rckpb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.273091 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274000 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" 
containerName="ovn-controller" containerID="cri-o://d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274135 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" containerID="cri-o://63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274168 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" containerID="cri-o://ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274196 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" containerID="cri-o://0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274224 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.274250 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" containerID="cri-o://5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: 
I0301 09:21:07.274277 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" containerID="cri-o://0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.370959 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" containerID="cri-o://9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" gracePeriod=30 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.627928 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.631692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-acl-logging/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.632258 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-controller/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.632719 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687486 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bv7mw"] Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687685 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kubecfg-setup" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687698 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kubecfg-setup" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687708 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687714 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687725 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687732 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687742 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687748 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" 
containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687763 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687770 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687775 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687782 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687788 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687795 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687801 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687815 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687825 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.687839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687845 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687939 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-node" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687949 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="sbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687964 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="kube-rbac-proxy-ovn-metrics" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687971 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687980 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="nbdb" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687989 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovn-acl-logging" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.687997 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="northd" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688004 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.688103 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688109 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.688119 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688125 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.688602 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerName="ovnkube-controller" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.696309 4792 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762733 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762781 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762804 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762829 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762928 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 
09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.762977 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763004 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763057 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763101 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763120 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763136 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763150 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763173 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") pod \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\" (UID: \"e2bd7bac-21cf-4657-ab84-68a14f99f8f0\") " Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: 
\"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763363 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763476 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763557 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763595 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763624 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.763975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764147 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764156 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket" (OuterVolumeSpecName: "log-socket") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764276 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764289 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764313 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764342 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash" (OuterVolumeSpecName: "host-slash") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log" (OuterVolumeSpecName: "node-log") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764667 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.764881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.769492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh" (OuterVolumeSpecName: "kube-api-access-kqvlh") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "kube-api-access-kqvlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.769648 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.776800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e2bd7bac-21cf-4657-ab84-68a14f99f8f0" (UID: "e2bd7bac-21cf-4657-ab84-68a14f99f8f0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/1.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813741 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b" exitCode=2 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerDied","Data":"43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.813855 4792 scope.go:117] "RemoveContainer" containerID="833126c3956c8927e1b68252bd9962e43df9c6e09dc2b98a20208c2db19a5fc1" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.814472 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.814838 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.817432 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovnkube-controller/3.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.820775 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-acl-logging/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821221 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pp7m_e2bd7bac-21cf-4657-ab84-68a14f99f8f0/ovn-controller/0.log" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821514 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821536 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821544 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821555 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821562 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821569 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" exitCode=0 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821575 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" exitCode=143 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821581 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" exitCode=143 Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821609 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821635 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821655 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821667 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821687 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821696 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821702 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821707 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821712 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821716 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821721 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821726 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821731 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821735 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821742 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821750 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821756 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821762 4792 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821767 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821772 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821777 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821782 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821788 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821792 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821798 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821805 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821812 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821818 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821823 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821828 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821834 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821838 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821844 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 
09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821849 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821854 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821859 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" event={"ID":"e2bd7bac-21cf-4657-ab84-68a14f99f8f0","Type":"ContainerDied","Data":"50a5a13eb582ab0332ca2180448f447182fdf584f81ede3ccf1a5ef0fe6bed57"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821873 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821882 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821888 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821894 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821899 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821936 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821942 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821947 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821952 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.821957 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.822069 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pp7m" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.842077 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.862724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 
09:21:07.864621 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-systemd-units\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-node-log\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-systemd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-slash\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-netns\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" 
(UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-log-socket\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864894 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865049 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.864818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-etc-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-var-lib-openvswitch\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-script-lib\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865355 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865366 4792 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865399 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865409 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865418 4792 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-slash\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865427 4792 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865443 4792 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-node-log\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865454 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865464 4792 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-run-ovn\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovnkube-config\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865475 4792 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-log-socket\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865503 4792 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-netd\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865513 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-cni-bin\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa26d667-5bcd-4849-b79d-e47e08e703d1-host-kubelet\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865590 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865599 4792 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865608 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865626 4792 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865635 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvlh\" (UniqueName: \"kubernetes.io/projected/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-kube-api-access-kqvlh\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865644 4792 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865652 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865661 4792 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e2bd7bac-21cf-4657-ab84-68a14f99f8f0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.865777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa26d667-5bcd-4849-b79d-e47e08e703d1-env-overrides\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.868229 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pp7m"] Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.868497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa26d667-5bcd-4849-b79d-e47e08e703d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.881722 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.882972 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8fv\" (UniqueName: \"kubernetes.io/projected/fa26d667-5bcd-4849-b79d-e47e08e703d1-kube-api-access-8b8fv\") pod \"ovnkube-node-bv7mw\" (UID: \"fa26d667-5bcd-4849-b79d-e47e08e703d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.894704 4792 scope.go:117] "RemoveContainer" 
containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.905511 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.917414 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.929886 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.942326 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.956203 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.970628 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.988219 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.989603 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.989713 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.989800 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.990510 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.990659 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.990710 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.991409 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991434 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991457 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.991898 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.991964 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container 
\"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992010 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.992405 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992445 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992463 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.992891 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" 
containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.992984 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993046 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.993462 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993515 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993538 4792 scope.go:117] 
"RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.993876 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.993982 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994090 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.994584 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994624 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.994642 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: E0301 09:21:07.994998 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995022 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995038 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995306 4792 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995332 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995567 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995588 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.995987 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not 
found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996008 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996273 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996293 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996644 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996665 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996850 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get 
container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.996871 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997220 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997243 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997520 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997542 4792 scope.go:117] "RemoveContainer" 
containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997819 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.997860 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998342 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998365 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788" Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could 
not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998606 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998803 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.998836 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999234 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999258 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999509 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999529 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999871 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist"
Mar 01 09:21:07 crc kubenswrapper[4792]: I0301 09:21:07.999895 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001290 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001313 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001681 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001708 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.001980 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002007 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002377 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002399 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002681 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002709 4792 scope.go:117] "RemoveContainer" containerID="9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.002989 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788"} err="failed to get container status \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": rpc error: code = NotFound desc = could not find container \"9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788\": container with ID starting with 9e7b9630390f7553858c5ffe0e20b23f20673df873081a103c784cb5ba209788 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003013 4792 scope.go:117] "RemoveContainer" containerID="157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003201 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c"} err="failed to get container status \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": rpc error: code = NotFound desc = could not find container \"157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c\": container with ID starting with 157d67fde13133a3772ecf3043bcbe41768bc5a4312dd724ab00de4d374cb93c not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.003225 4792 scope.go:117] "RemoveContainer" containerID="63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004272 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0"} err="failed to get container status \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": rpc error: code = NotFound desc = could not find container \"63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0\": container with ID starting with 63e7b3d21974364839223b560d89372f19782292c82ffa5b9f3651307e63e8a0 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004295 4792 scope.go:117] "RemoveContainer" containerID="ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004742 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe"} err="failed to get container status \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": rpc error: code = NotFound desc = could not find container \"ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe\": container with ID starting with ce70eebae9cc6939b048bef27b2505b86fccdcf4feeddde4d3a3d2027093ffbe not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.004773 4792 scope.go:117] "RemoveContainer" containerID="0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005199 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e"} err="failed to get container status \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": rpc error: code = NotFound desc = could not find container \"0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e\": container with ID starting with 0a5afc25cb95762223458b4bc916095d1c8b0895178f4b879e0888144106623e not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005229 4792 scope.go:117] "RemoveContainer" containerID="05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005457 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8"} err="failed to get container status \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": rpc error: code = NotFound desc = could not find container \"05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8\": container with ID starting with 05e1d59cbc5e649fad34bba9d1b0e55835e3af16cd8c8c9b166b0245dc600da8 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.005478 4792 scope.go:117] "RemoveContainer" containerID="5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006048 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3"} err="failed to get container status \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": rpc error: code = NotFound desc = could not find container \"5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3\": container with ID starting with 5075f2983a6cbb666bfd052754cfae30f6e643f09f86633a68a62eb348f660e3 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006069 4792 scope.go:117] "RemoveContainer" containerID="0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006316 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a"} err="failed to get container status \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": rpc error: code = NotFound desc = could not find container \"0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a\": container with ID starting with 0d88d56713bee465a6950b4f514ef277e193bfead70f6a14f787c1da3ea9fd7a not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006337 4792 scope.go:117] "RemoveContainer" containerID="d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006872 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948"} err="failed to get container status \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": rpc error: code = NotFound desc = could not find container \"d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948\": container with ID starting with d9646fad310b6edea9fa29ec4c4ef4151ce4a0bad6961be70f04ea8ac014e948 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.006896 4792 scope.go:117] "RemoveContainer" containerID="d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.007257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6"} err="failed to get container status \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": rpc error: code = NotFound desc = could not find container \"d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6\": container with ID starting with d41ffd76f197ba47f7502bf4fae77d8ec0970fc0516ed092ba0c21c03aefaec6 not found: ID does not exist"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.020948 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:08 crc kubenswrapper[4792]: W0301 09:21:08.043931 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa26d667_5bcd_4849_b79d_e47e08e703d1.slice/crio-fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e WatchSource:0}: Error finding container fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e: Status 404 returned error can't find the container with id fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.829491 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log"
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832252 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa26d667-5bcd-4849-b79d-e47e08e703d1" containerID="6ad83038e4755fd9a4617fe948f19c9667dee0891394f503d180010493494e46" exitCode=0
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832288 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerDied","Data":"6ad83038e4755fd9a4617fe948f19c9667dee0891394f503d180010493494e46"}
Mar 01 09:21:08 crc kubenswrapper[4792]: I0301 09:21:08.832311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"fe65abd025369aebfe45a140a086bc8b6b2e24c306998db6e63f2381f1ac688e"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.415081 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bd7bac-21cf-4657-ab84-68a14f99f8f0" path="/var/lib/kubelet/pods/e2bd7bac-21cf-4657-ab84-68a14f99f8f0/volumes"
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844669 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"2eae60e96dacbd394de9d78ba8aaebfeb80a83491f5e7a574680cb7c0d675d03"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"c52bfc2bed86d384469b53f4a7cfd14f8174bdbd6b00e0c169d18730ac906eb9"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"3372bcd280ade68b9a66bfecefd5d6817ff01835afca0f9d6dd203fa2b2fd016"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844745 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"8001de267e82d6191b4533745448927cda99dbcd39bcaadeaeedd60ea3c6a167"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844756 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"e87257de761d1810d6bd3514e204c87ebf71f832dc4c29e423b010f0523889bc"}
Mar 01 09:21:09 crc kubenswrapper[4792]: I0301 09:21:09.844767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"75347305ce5e53b1700c3a13e4c3619fd2e4bcdfbf6da5dfa5ec0c63ac1491b6"}
Mar 01 09:21:11 crc kubenswrapper[4792]: I0301 09:21:11.859857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"013c9d261e6682f9b171e22ddb5f4a018944dc084003803521411736649fbf53"}
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.887548 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" event={"ID":"fa26d667-5bcd-4849-b79d-e47e08e703d1","Type":"ContainerStarted","Data":"da8fe09d99e658d57a1f211fbd8110b35a1ff986c90fd4fa4bfee69ae029b0a7"}
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.889007 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.889065 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.890591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.914229 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.915246 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:14 crc kubenswrapper[4792]: I0301 09:21:14.919814 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw" podStartSLOduration=7.919800515 podStartE2EDuration="7.919800515s" podCreationTimestamp="2026-03-01 09:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:21:14.914833602 +0000 UTC m=+804.156712789" watchObservedRunningTime="2026-03-01 09:21:14.919800515 +0000 UTC m=+804.161679712"
Mar 01 09:21:22 crc kubenswrapper[4792]: I0301 09:21:22.408645 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"
Mar 01 09:21:22 crc kubenswrapper[4792]: E0301 09:21:22.409400 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pq28p_openshift-multus(ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3)\"" pod="openshift-multus/multus-pq28p" podUID="ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.408629 4792 scope.go:117] "RemoveContainer" containerID="43e6bc214d9cf4f260a6b3f92b2bb7a5207d98f68c3fc275ce08d07a9684d65b"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.906161 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.908082 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.910390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 01 09:21:35 crc kubenswrapper[4792]: I0301 09:21:35.915152 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.035551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.042259 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq28p_ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3/kube-multus/2.log"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.042315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq28p" event={"ID":"ad44dee2-f99e-4e77-bc6a-2ab7f39eddf3","Type":"ContainerStarted","Data":"8e17d1f3b79d200f140ec6fd0c086d624c03058d1874684b990b78a70fe1d430"}
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.136617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.136978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.137388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.156604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: I0301 09:21:36.219826 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243203 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243270 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243291 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:36 crc kubenswrapper[4792]: E0301 09:21:36.243335 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace(8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace(8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_openshift-marketplace_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd_0(982d1baaac43b414f35be9de01a2fcb770603a9e76361e62b5a12971ac545955): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.047842 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.048509 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:37 crc kubenswrapper[4792]: I0301 09:21:37.298781 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"]
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.041952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv7mw"
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056046 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="76925cc358923b750081a35dd80c9bf1cc4033be7b2c6170e4dc21dd66f8ec00" exitCode=0
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"76925cc358923b750081a35dd80c9bf1cc4033be7b2c6170e4dc21dd66f8ec00"}
Mar 01 09:21:38 crc kubenswrapper[4792]: I0301 09:21:38.056115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerStarted","Data":"fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"}
Mar 01 09:21:40 crc kubenswrapper[4792]: I0301 09:21:40.067865 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="b8da3653fca540d47b970da8d99ec257f6fa500230f30cb86cb66bf52be288aa" exitCode=0
Mar 01 09:21:40 crc kubenswrapper[4792]: I0301 09:21:40.067935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"b8da3653fca540d47b970da8d99ec257f6fa500230f30cb86cb66bf52be288aa"}
Mar 01 09:21:40 crc kubenswrapper[4792]: E0301 09:21:40.395988 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2fbe4e_4a71_4ce1_b7cc_3063b89d65bd.slice/crio-conmon-5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09.scope\": RecentStats: unable to find data in memory cache]"
Mar 01 09:21:41 crc kubenswrapper[4792]: I0301 09:21:41.078055 4792 generic.go:334] "Generic (PLEG): container finished" podID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerID="5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09" exitCode=0
Mar 01 09:21:41 crc kubenswrapper[4792]: I0301 09:21:41.078114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"5d89cfe5e15e45905d7e95e396f3b50c3188d4882555350d9b4867807a42dc09"}
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.336444 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") pod \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\" (UID: \"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd\") "
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.517850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle" (OuterVolumeSpecName: "bundle") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.525942 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2" (OuterVolumeSpecName: "kube-api-access-xhxg2") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "kube-api-access-xhxg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.534733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util" (OuterVolumeSpecName: "util") pod "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" (UID: "8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.618973 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-util\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.619052 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxg2\" (UniqueName: \"kubernetes.io/projected/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-kube-api-access-xhxg2\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:42 crc kubenswrapper[4792]: I0301 09:21:42.619070 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.093651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst" event={"ID":"8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd","Type":"ContainerDied","Data":"fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"}
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.094143 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa7a7493167702c92a410f30bd06af51593614b91aa3e9ddce0d957224b436d9"
Mar 01 09:21:43 crc kubenswrapper[4792]: I0301 09:21:43.094227 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.429872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"]
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430139 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="pull"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430151 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="pull"
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430160 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="util"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430167 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="util"
Mar 01 09:21:44 crc kubenswrapper[4792]: E0301 09:21:44.430177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430184 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430276 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd" containerName="extract"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.430641 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.432198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-s9kc7"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.433418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.434418 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.479418 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"]
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.540752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID: \"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.642423 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID: \"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"
Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.658841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9dxl\" (UniqueName: \"kubernetes.io/projected/fb942d1c-2a1a-4265-ae29-02f185d4cc40-kube-api-access-g9dxl\") pod \"nmstate-operator-75c5dccd6c-chfpw\" (UID:
\"fb942d1c-2a1a-4265-ae29-02f185d4cc40\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.744161 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" Mar 01 09:21:44 crc kubenswrapper[4792]: I0301 09:21:44.992413 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw"] Mar 01 09:21:45 crc kubenswrapper[4792]: I0301 09:21:45.104144 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" event={"ID":"fb942d1c-2a1a-4265-ae29-02f185d4cc40","Type":"ContainerStarted","Data":"56a3baf522f3befc255427a7e92fb6d3eaea9fe2c0ba4170d7dcb4b23f4876bc"} Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.120790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" event={"ID":"fb942d1c-2a1a-4265-ae29-02f185d4cc40","Type":"ContainerStarted","Data":"246728d3a0a95716bc9c743cd4742fc1074032d3fac38b2b4c5926f20fa51c17"} Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.134160 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-chfpw" podStartSLOduration=1.8528042660000001 podStartE2EDuration="4.134146652s" podCreationTimestamp="2026-03-01 09:21:44 +0000 UTC" firstStartedPulling="2026-03-01 09:21:44.999953816 +0000 UTC m=+834.241833023" lastFinishedPulling="2026-03-01 09:21:47.281296212 +0000 UTC m=+836.523175409" observedRunningTime="2026-03-01 09:21:48.133164757 +0000 UTC m=+837.375043954" watchObservedRunningTime="2026-03-01 09:21:48.134146652 +0000 UTC m=+837.376025849" Mar 01 09:21:48 crc kubenswrapper[4792]: I0301 09:21:48.216445 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 01 09:21:49 crc 
kubenswrapper[4792]: I0301 09:21:49.116090 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.116925 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.118853 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9pv2w" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.134075 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.159034 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.159765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.161659 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.172756 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.179053 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9j2tz"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.179858 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.220392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.282999 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.283785 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.286525 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kt9z4" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.286525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.288782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.302811 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.321986 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322025 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322070 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322102 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: 
\"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.322176 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.341032 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7d9\" (UniqueName: \"kubernetes.io/projected/bfe2cc56-28ca-4201-ba5a-4208dd1ec818-kube-api-access-lj7d9\") pod \"nmstate-metrics-69594cc75-97cv9\" (UID: \"bfe2cc56-28ca-4201-ba5a-4208dd1ec818\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvz9\" (UniqueName: 
\"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423752 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.423822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-nmstate-lock\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424561 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-ovs-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.424881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7105919f-ddac-45db-a8f7-bd927e5737df-dbus-socket\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.429633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/aa2300d6-10c0-4dc9-812a-fcb30f09920e-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.431312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.475694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5mxc\" (UniqueName: \"kubernetes.io/projected/7105919f-ddac-45db-a8f7-bd927e5737df-kube-api-access-h5mxc\") pod \"nmstate-handler-9j2tz\" (UID: \"7105919f-ddac-45db-a8f7-bd927e5737df\") " pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.497290 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkh22\" (UniqueName: \"kubernetes.io/projected/aa2300d6-10c0-4dc9-812a-fcb30f09920e-kube-api-access-hkh22\") pod \"nmstate-webhook-786f45cff4-zwhpc\" (UID: \"aa2300d6-10c0-4dc9-812a-fcb30f09920e\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.497546 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvz9\" (UniqueName: \"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525177 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.525224 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.526576 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7ca92c8-f38b-4a0a-b330-5809993cbb49-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.527827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ca92c8-f38b-4a0a-b330-5809993cbb49-plugin-serving-cert\") 
pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.548831 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.549664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.553801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvz9\" (UniqueName: \"kubernetes.io/projected/f7ca92c8-f38b-4a0a-b330-5809993cbb49-kube-api-access-4gvz9\") pod \"nmstate-console-plugin-5dcbbd79cf-mtxkm\" (UID: \"f7ca92c8-f38b-4a0a-b330-5809993cbb49\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.568802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.596312 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727956 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.727990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.728029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.728081 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.754469 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-97cv9"] Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.773437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832733 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " 
pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.832787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.833698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-oauth-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.833785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-console-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.834732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-trusted-ca-bundle\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 
09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.834776 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d040336d-5b5f-44e9-959d-84260224c25d-service-ca\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.837530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-serving-cert\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.838452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d040336d-5b5f-44e9-959d-84260224c25d-console-oauth-config\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.855394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54sqs\" (UniqueName: \"kubernetes.io/projected/d040336d-5b5f-44e9-959d-84260224c25d-kube-api-access-54sqs\") pod \"console-f9d787bcd-49gzg\" (UID: \"d040336d-5b5f-44e9-959d-84260224c25d\") " pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.874084 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:49 crc kubenswrapper[4792]: I0301 09:21:49.940640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc"] Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.023616 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm"] Mar 01 09:21:50 crc kubenswrapper[4792]: W0301 09:21:50.029078 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ca92c8_f38b_4a0a_b330_5809993cbb49.slice/crio-b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f WatchSource:0}: Error finding container b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f: Status 404 returned error can't find the container with id b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.133835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9j2tz" event={"ID":"7105919f-ddac-45db-a8f7-bd927e5737df","Type":"ContainerStarted","Data":"e2ca7a86913e06245de7da9bea06c93473666473dd10d1f84831b2eca13979cd"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.135194 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"8befbe26d96643fd32f1674a35656efdebc284b5f6c25b19a271aacce8d6050a"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.136111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" event={"ID":"aa2300d6-10c0-4dc9-812a-fcb30f09920e","Type":"ContainerStarted","Data":"a7ee32be1ab30155d414feefebdc19ab56d1866ab4ffd457abd74dcf66fa8915"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 
09:21:50.137124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" event={"ID":"f7ca92c8-f38b-4a0a-b330-5809993cbb49","Type":"ContainerStarted","Data":"b19f0c79b85a6766587e8c440aa6b1232df07e88e2e35736f259bd8f65cda23f"} Mar 01 09:21:50 crc kubenswrapper[4792]: I0301 09:21:50.294082 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d787bcd-49gzg"] Mar 01 09:21:50 crc kubenswrapper[4792]: W0301 09:21:50.295032 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd040336d_5b5f_44e9_959d_84260224c25d.slice/crio-df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d WatchSource:0}: Error finding container df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d: Status 404 returned error can't find the container with id df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.145276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d787bcd-49gzg" event={"ID":"d040336d-5b5f-44e9-959d-84260224c25d","Type":"ContainerStarted","Data":"8e3bc4647d1bcc0ace6be9cf55ca15cc2f9ebf86f45cef84623bffbddf5feedc"} Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.145624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d787bcd-49gzg" event={"ID":"d040336d-5b5f-44e9-959d-84260224c25d","Type":"ContainerStarted","Data":"df68ab51b3fb82ea3e3c75e290c5ebe27d1af3c18b75b28c7b5448829cb26e7d"} Mar 01 09:21:51 crc kubenswrapper[4792]: I0301 09:21:51.162875 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d787bcd-49gzg" podStartSLOduration=2.162853582 podStartE2EDuration="2.162853582s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:21:51.161750365 +0000 UTC m=+840.403629562" watchObservedRunningTime="2026-03-01 09:21:51.162853582 +0000 UTC m=+840.404732779" Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.159134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" event={"ID":"f7ca92c8-f38b-4a0a-b330-5809993cbb49","Type":"ContainerStarted","Data":"e352eeb70fea6a32f9256ee242d703c85d61f0b059a6b9ebc168263dfa4de9e4"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.161223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"bf5c4e80cffac78d41a00f0e98d52038426e8baa29faf6db8b2af41956c8bb77"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.163147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" event={"ID":"aa2300d6-10c0-4dc9-812a-fcb30f09920e","Type":"ContainerStarted","Data":"995a34fecaba5c30fc25409e04e2a82601abf24151a911e8ee750fe446f99450"} Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.163261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:21:53 crc kubenswrapper[4792]: I0301 09:21:53.177965 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mtxkm" podStartSLOduration=1.320982143 podStartE2EDuration="4.177944924s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:50.036832987 +0000 UTC m=+839.278712184" lastFinishedPulling="2026-03-01 09:21:52.893795768 +0000 UTC m=+842.135674965" observedRunningTime="2026-03-01 09:21:53.17492343 +0000 UTC m=+842.416802627" watchObservedRunningTime="2026-03-01 
09:21:53.177944924 +0000 UTC m=+842.419824121" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.170114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9j2tz" event={"ID":"7105919f-ddac-45db-a8f7-bd927e5737df","Type":"ContainerStarted","Data":"31459e1e34644765b296e5000ebb6bb93f3dea59a8dd2dc2c50eadced7ab305a"} Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.187896 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9j2tz" podStartSLOduration=1.8489637220000001 podStartE2EDuration="5.187876664s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.557138451 +0000 UTC m=+838.799017648" lastFinishedPulling="2026-03-01 09:21:52.896051383 +0000 UTC m=+842.137930590" observedRunningTime="2026-03-01 09:21:54.185936566 +0000 UTC m=+843.427815773" watchObservedRunningTime="2026-03-01 09:21:54.187876664 +0000 UTC m=+843.429755861" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.188743 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" podStartSLOduration=2.22955115 podStartE2EDuration="5.188735515s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.956217356 +0000 UTC m=+839.198096553" lastFinishedPulling="2026-03-01 09:21:52.915401721 +0000 UTC m=+842.157280918" observedRunningTime="2026-03-01 09:21:53.204998632 +0000 UTC m=+842.446877819" watchObservedRunningTime="2026-03-01 09:21:54.188735515 +0000 UTC m=+843.430614712" Mar 01 09:21:54 crc kubenswrapper[4792]: I0301 09:21:54.498828 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:56 crc kubenswrapper[4792]: I0301 09:21:56.183588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" 
event={"ID":"bfe2cc56-28ca-4201-ba5a-4208dd1ec818","Type":"ContainerStarted","Data":"e0f6095afec3a06107aabffef4c0efee5b1f77d77ab71fbe9c2b8cadaaa723dc"} Mar 01 09:21:56 crc kubenswrapper[4792]: I0301 09:21:56.205775 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-97cv9" podStartSLOduration=1.773733074 podStartE2EDuration="7.205743413s" podCreationTimestamp="2026-03-01 09:21:49 +0000 UTC" firstStartedPulling="2026-03-01 09:21:49.769956267 +0000 UTC m=+839.011835464" lastFinishedPulling="2026-03-01 09:21:55.201966606 +0000 UTC m=+844.443845803" observedRunningTime="2026-03-01 09:21:56.203821486 +0000 UTC m=+845.445700723" watchObservedRunningTime="2026-03-01 09:21:56.205743413 +0000 UTC m=+845.447622660" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.525051 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9j2tz" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.874467 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.874510 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:21:59 crc kubenswrapper[4792]: I0301 09:21:59.879396 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.128620 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.129282 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.132125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.132478 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.134017 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.138993 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.214348 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d787bcd-49gzg" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.280034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.298447 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.380931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 
09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.406748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"auto-csr-approver-29539282-dcfkb\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.481383 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:00 crc kubenswrapper[4792]: I0301 09:22:00.855489 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:22:00 crc kubenswrapper[4792]: W0301 09:22:00.859241 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fcb7c96_6ab5_413c_b776_d1bc938e85c0.slice/crio-9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a WatchSource:0}: Error finding container 9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a: Status 404 returned error can't find the container with id 9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a Mar 01 09:22:01 crc kubenswrapper[4792]: I0301 09:22:01.217610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerStarted","Data":"9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a"} Mar 01 09:22:02 crc kubenswrapper[4792]: I0301 09:22:02.225562 4792 generic.go:334] "Generic (PLEG): container finished" podID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerID="b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7" exitCode=0 Mar 01 09:22:02 crc kubenswrapper[4792]: I0301 09:22:02.225644 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerDied","Data":"b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7"} Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.482075 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.620638 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") pod \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\" (UID: \"1fcb7c96-6ab5-413c-b776-d1bc938e85c0\") " Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.629351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv" (OuterVolumeSpecName: "kube-api-access-xdhdv") pod "1fcb7c96-6ab5-413c-b776-d1bc938e85c0" (UID: "1fcb7c96-6ab5-413c-b776-d1bc938e85c0"). InnerVolumeSpecName "kube-api-access-xdhdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:03 crc kubenswrapper[4792]: I0301 09:22:03.723975 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhdv\" (UniqueName: \"kubernetes.io/projected/1fcb7c96-6ab5-413c-b776-d1bc938e85c0-kube-api-access-xdhdv\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" event={"ID":"1fcb7c96-6ab5-413c-b776-d1bc938e85c0","Type":"ContainerDied","Data":"9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a"} Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241266 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed4209950b7dc0c0219f7cbf98dfb2fd0f042c8e8d978db82b2efba327f129a" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.241289 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539282-dcfkb" Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.549139 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:22:04 crc kubenswrapper[4792]: I0301 09:22:04.556075 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539276-sbq86"] Mar 01 09:22:05 crc kubenswrapper[4792]: I0301 09:22:05.416158 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffc24e3-2b40-4368-91a7-474239cc46fc" path="/var/lib/kubelet/pods/0ffc24e3-2b40-4368-91a7-474239cc46fc/volumes" Mar 01 09:22:09 crc kubenswrapper[4792]: I0301 09:22:09.778555 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-zwhpc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.850602 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:22 crc kubenswrapper[4792]: E0301 09:22:22.852338 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.852426 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.852620 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" containerName="oc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.853463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.871963 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.877685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915271 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:22 crc kubenswrapper[4792]: I0301 09:22:22.915415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.016983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.017501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.018407 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.041781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.190065 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:23 crc kubenswrapper[4792]: I0301 09:22:23.592223 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t"] Mar 01 09:22:23 crc kubenswrapper[4792]: W0301 09:22:23.605997 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c3d28a_4f36_4a3c_a4f6_793a5f945cd4.slice/crio-d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e WatchSource:0}: Error finding container d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e: Status 404 returned error can't find the container with id d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.375123 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="4470a78593c70df9a7f1348a7ba2122a59c8f34035c2eb3e36dfe3908e9d37a9" exitCode=0 Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.375425 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"4470a78593c70df9a7f1348a7ba2122a59c8f34035c2eb3e36dfe3908e9d37a9"} Mar 01 09:22:24 crc kubenswrapper[4792]: I0301 09:22:24.376408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerStarted","Data":"d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e"} Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.230406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.232198 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.239023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.248949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.249058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.249116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " 
pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350434 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.350513 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.351083 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.351236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.352820 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zrzcg" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" containerID="cri-o://80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" gracePeriod=15 Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.377638 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"redhat-operators-hw8r6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.562353 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.738832 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zrzcg_86788093-42e5-4fa0-9595-97a910e6557e/console/0.log" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.739150 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: 
\"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757594 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.757682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") pod \"86788093-42e5-4fa0-9595-97a910e6557e\" (UID: \"86788093-42e5-4fa0-9595-97a910e6557e\") " Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.758675 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config" (OuterVolumeSpecName: "console-config") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.758985 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.759224 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.766602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca" (OuterVolumeSpecName: "service-ca") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.767029 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8" (OuterVolumeSpecName: "kube-api-access-6qmx8") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "kube-api-access-6qmx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.768172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.768463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "86788093-42e5-4fa0-9595-97a910e6557e" (UID: "86788093-42e5-4fa0-9595-97a910e6557e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.798439 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:25 crc kubenswrapper[4792]: W0301 09:22:25.807215 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2905773e_c73d_4965_83a8_b1eff758a9b6.slice/crio-fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b WatchSource:0}: Error finding container fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b: Status 404 returned error can't find the container with id fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858371 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 
09:22:25.858398 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858408 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788093-42e5-4fa0-9595-97a910e6557e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858417 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858426 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858435 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmx8\" (UniqueName: \"kubernetes.io/projected/86788093-42e5-4fa0-9595-97a910e6557e-kube-api-access-6qmx8\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:25 crc kubenswrapper[4792]: I0301 09:22:25.858443 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788093-42e5-4fa0-9595-97a910e6557e-console-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.397441 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="b1412ef8ba2536904ff48ce6977ac97f0c2513be0b40dc923238218df7f2cb91" exitCode=0 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.397594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"b1412ef8ba2536904ff48ce6977ac97f0c2513be0b40dc923238218df7f2cb91"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399728 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" exitCode=0 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.399805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409605 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zrzcg_86788093-42e5-4fa0-9595-97a910e6557e/console/0.log" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409653 4792 generic.go:334] "Generic (PLEG): container finished" podID="86788093-42e5-4fa0-9595-97a910e6557e" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" exitCode=2 Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerDied","Data":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409710 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zrzcg" event={"ID":"86788093-42e5-4fa0-9595-97a910e6557e","Type":"ContainerDied","Data":"d8eca6eb41f8b18cd9f4704945f3f7da8382b69f90c64f182bda36eef645951e"} Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409733 4792 scope.go:117] "RemoveContainer" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.409939 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zrzcg" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.487741 4792 scope.go:117] "RemoveContainer" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: E0301 09:22:26.497073 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": container with ID starting with 80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184 not found: ID does not exist" containerID="80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.497119 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184"} err="failed to get container status \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": rpc error: code = NotFound desc = could not find container \"80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184\": container with ID starting with 80316d2117c800289193f2a4dcfeab49966269af79c4c3f4ca03bd69d11da184 not found: ID does not exist" Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.502662 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:26 crc kubenswrapper[4792]: I0301 09:22:26.509296 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zrzcg"] Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.420148 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerID="d67d114a432146643ca9daddb1ebead9e2e0d67e92df3f14f252b306bf3674b1" exitCode=0 Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.421612 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86788093-42e5-4fa0-9595-97a910e6557e" path="/var/lib/kubelet/pods/86788093-42e5-4fa0-9595-97a910e6557e/volumes" Mar 01 09:22:27 crc kubenswrapper[4792]: I0301 09:22:27.422478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"d67d114a432146643ca9daddb1ebead9e2e0d67e92df3f14f252b306bf3674b1"} Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.431016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.654339 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700530 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.700601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") pod \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\" (UID: \"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4\") " Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.701426 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle" (OuterVolumeSpecName: "bundle") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.706073 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl" (OuterVolumeSpecName: "kube-api-access-pzcbl") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "kube-api-access-pzcbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.721926 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util" (OuterVolumeSpecName: "util") pod "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" (UID: "a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801872 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzcbl\" (UniqueName: \"kubernetes.io/projected/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-kube-api-access-pzcbl\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801930 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-util\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:28 crc kubenswrapper[4792]: I0301 09:22:28.801948 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.445631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" event={"ID":"a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4","Type":"ContainerDied","Data":"d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e"} Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.446161 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09e4aa9d099d0f5855da4eafb998a6a9b4f330a892524865861519e11e6a71e" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.445656 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t" Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.452008 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" exitCode=0 Mar 01 09:22:29 crc kubenswrapper[4792]: I0301 09:22:29.452062 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} Mar 01 09:22:30 crc kubenswrapper[4792]: I0301 09:22:30.459629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerStarted","Data":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} Mar 01 09:22:30 crc kubenswrapper[4792]: I0301 09:22:30.477230 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hw8r6" podStartSLOduration=2.073781403 podStartE2EDuration="5.477214367s" podCreationTimestamp="2026-03-01 09:22:25 +0000 UTC" firstStartedPulling="2026-03-01 09:22:26.426400347 +0000 UTC m=+875.668279544" lastFinishedPulling="2026-03-01 09:22:29.829833281 +0000 UTC m=+879.071712508" observedRunningTime="2026-03-01 09:22:30.475826443 +0000 UTC m=+879.717705640" watchObservedRunningTime="2026-03-01 09:22:30.477214367 +0000 UTC m=+879.719093564" Mar 01 09:22:34 crc kubenswrapper[4792]: I0301 09:22:34.942545 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 
01 09:22:34 crc kubenswrapper[4792]: I0301 09:22:34.943114 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:22:35 crc kubenswrapper[4792]: I0301 09:22:35.563059 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:35 crc kubenswrapper[4792]: I0301 09:22:35.563112 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:36 crc kubenswrapper[4792]: I0301 09:22:36.603652 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hw8r6" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" probeResult="failure" output=< Mar 01 09:22:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:22:36 crc kubenswrapper[4792]: > Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321361 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321815 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="util" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321827 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="util" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 
09:22:38.321846 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: E0301 09:22:38.321876 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="pull" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321882 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="pull" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321988 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="86788093-42e5-4fa0-9595-97a910e6557e" containerName="console" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.321999 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4" containerName="extract" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.322338 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.326645 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327282 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327362 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.327470 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2g8xw" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.357056 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.431845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.432136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod 
\"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.432669 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.534521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.534816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.535829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc 
kubenswrapper[4792]: I0301 09:22:38.540784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-webhook-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.541315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba22e25a-31e8-4ca7-b169-f7433eda818b-apiservice-cert\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.561531 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rrzh\" (UniqueName: \"kubernetes.io/projected/ba22e25a-31e8-4ca7-b169-f7433eda818b-kube-api-access-5rrzh\") pod \"metallb-operator-controller-manager-5cd84fcfbc-lrpmz\" (UID: \"ba22e25a-31e8-4ca7-b169-f7433eda818b\") " pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.637280 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.671847 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.673056 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.676512 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-52bcm" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.676712 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.677510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.751379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.839706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.940940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.941218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.941246 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.948317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-apiservice-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 
09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.949889 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cf86866e-8afa-44da-a688-e1c018a025bd-webhook-cert\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.968589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xn97\" (UniqueName: \"kubernetes.io/projected/cf86866e-8afa-44da-a688-e1c018a025bd-kube-api-access-8xn97\") pod \"metallb-operator-webhook-server-776c7d78bd-jwfh6\" (UID: \"cf86866e-8afa-44da-a688-e1c018a025bd\") " pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:38 crc kubenswrapper[4792]: I0301 09:22:38.993993 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz"] Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.013946 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.336141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6"] Mar 01 09:22:39 crc kubenswrapper[4792]: W0301 09:22:39.345700 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf86866e_8afa_44da_a688_e1c018a025bd.slice/crio-86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181 WatchSource:0}: Error finding container 86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181: Status 404 returned error can't find the container with id 86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181 Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.514556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" event={"ID":"ba22e25a-31e8-4ca7-b169-f7433eda818b","Type":"ContainerStarted","Data":"5eb6ffdc047fa110ebb051125dbd45715246f7404f1693ab496295b8fd7f3faa"} Mar 01 09:22:39 crc kubenswrapper[4792]: I0301 09:22:39.523727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" event={"ID":"cf86866e-8afa-44da-a688-e1c018a025bd","Type":"ContainerStarted","Data":"86d45064458c5acfdb658657c1ebc69cb067763c259b3db068b5bdcabcef2181"} Mar 01 09:22:43 crc kubenswrapper[4792]: I0301 09:22:43.571853 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" event={"ID":"ba22e25a-31e8-4ca7-b169-f7433eda818b","Type":"ContainerStarted","Data":"99f873ea6be94f67a1b492a1dd3b7b43beb066e3994fe56e64d4744d9a32b891"} Mar 01 09:22:43 crc kubenswrapper[4792]: I0301 09:22:43.572505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.584114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" event={"ID":"cf86866e-8afa-44da-a688-e1c018a025bd","Type":"ContainerStarted","Data":"b2b42b80710f0a209f12adfaf613015c088fbd218de782f286e1ffa0664b4b3d"} Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.585526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.600015 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" podStartSLOduration=3.756940419 podStartE2EDuration="7.59999524s" podCreationTimestamp="2026-03-01 09:22:38 +0000 UTC" firstStartedPulling="2026-03-01 09:22:39.004617284 +0000 UTC m=+888.246496481" lastFinishedPulling="2026-03-01 09:22:42.847672095 +0000 UTC m=+892.089551302" observedRunningTime="2026-03-01 09:22:43.592664552 +0000 UTC m=+892.834543769" watchObservedRunningTime="2026-03-01 09:22:45.59999524 +0000 UTC m=+894.841874437" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.602318 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" podStartSLOduration=1.594532041 podStartE2EDuration="7.602301627s" podCreationTimestamp="2026-03-01 09:22:38 +0000 UTC" firstStartedPulling="2026-03-01 09:22:39.349277975 +0000 UTC m=+888.591157172" lastFinishedPulling="2026-03-01 09:22:45.357047561 +0000 UTC m=+894.598926758" observedRunningTime="2026-03-01 09:22:45.598614486 +0000 UTC m=+894.840493693" watchObservedRunningTime="2026-03-01 09:22:45.602301627 +0000 UTC m=+894.844180824" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.632804 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.675326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:45 crc kubenswrapper[4792]: I0301 09:22:45.868040 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:47 crc kubenswrapper[4792]: I0301 09:22:47.595007 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hw8r6" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" containerID="cri-o://cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" gracePeriod=2 Mar 01 09:22:47 crc kubenswrapper[4792]: I0301 09:22:47.975767 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.071477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") 
pod \"2905773e-c73d-4965-83a8-b1eff758a9b6\" (UID: \"2905773e-c73d-4965-83a8-b1eff758a9b6\") " Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.073223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities" (OuterVolumeSpecName: "utilities") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.082346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt" (OuterVolumeSpecName: "kube-api-access-ssnqt") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "kube-api-access-ssnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.173330 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssnqt\" (UniqueName: \"kubernetes.io/projected/2905773e-c73d-4965-83a8-b1eff758a9b6-kube-api-access-ssnqt\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.173360 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.187572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2905773e-c73d-4965-83a8-b1eff758a9b6" (UID: "2905773e-c73d-4965-83a8-b1eff758a9b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.274688 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2905773e-c73d-4965-83a8-b1eff758a9b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602424 4792 generic.go:334] "Generic (PLEG): container finished" podID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" exitCode=0 Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602497 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw8r6" event={"ID":"2905773e-c73d-4965-83a8-b1eff758a9b6","Type":"ContainerDied","Data":"fc49f8e966aa2484369924988e53dd7acfdccbf1699d09f3aa942c9391b8496b"} Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602517 4792 scope.go:117] "RemoveContainer" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.602640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hw8r6" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.625282 4792 scope.go:117] "RemoveContainer" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.638324 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.650732 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hw8r6"] Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.655034 4792 scope.go:117] "RemoveContainer" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.687589 4792 scope.go:117] "RemoveContainer" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688061 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": container with ID starting with cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08 not found: ID does not exist" containerID="cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688099 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08"} err="failed to get container status \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": rpc error: code = NotFound desc = could not find container \"cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08\": container with ID starting with cc13c02cfab236dc86f662df07250f98a1840645f3d512661c31a12e8c7b0f08 not found: ID does 
not exist" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688121 4792 scope.go:117] "RemoveContainer" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688438 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": container with ID starting with f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400 not found: ID does not exist" containerID="f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688460 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400"} err="failed to get container status \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": rpc error: code = NotFound desc = could not find container \"f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400\": container with ID starting with f501896b6f9fd28b97abc647b2887daf59ebd8100e2ca875ad728ab54e2ea400 not found: ID does not exist" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688486 4792 scope.go:117] "RemoveContainer" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: E0301 09:22:48.688754 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": container with ID starting with f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15 not found: ID does not exist" containerID="f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15" Mar 01 09:22:48 crc kubenswrapper[4792]: I0301 09:22:48.688777 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15"} err="failed to get container status \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": rpc error: code = NotFound desc = could not find container \"f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15\": container with ID starting with f68e5b68249b6c436545d18ed3e8511507d143daee0d4270a3aa78de08afac15 not found: ID does not exist" Mar 01 09:22:49 crc kubenswrapper[4792]: I0301 09:22:49.416497 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" path="/var/lib/kubelet/pods/2905773e-c73d-4965-83a8-b1eff758a9b6/volumes" Mar 01 09:22:52 crc kubenswrapper[4792]: I0301 09:22:52.858467 4792 scope.go:117] "RemoveContainer" containerID="7f82c367589f33dd358f1e6a48f6206b0470bf80f8be1de16c8420482e80dba1" Mar 01 09:22:59 crc kubenswrapper[4792]: I0301 09:22:59.019832 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-776c7d78bd-jwfh6" Mar 01 09:23:04 crc kubenswrapper[4792]: I0301 09:23:04.943114 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:23:04 crc kubenswrapper[4792]: I0301 09:23:04.943588 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:23:18 crc kubenswrapper[4792]: I0301 09:23:18.639484 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5cd84fcfbc-lrpmz" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.457539 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fjh95"] Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458206 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-content" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458221 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-content" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458248 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-utilities" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458255 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="extract-utilities" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.458263 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458269 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.458363 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2905773e-c73d-4965-83a8-b1eff758a9b6" containerName="registry-server" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.460254 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.462990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 
09:23:19.463212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.463306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.465267 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ml45x" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.469995 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470409 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470476 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.470747 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.473874 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.490537 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566711 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566790 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566813 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566874 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.566938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-sockets\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.567041 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.567089 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs podName:53127911-b831-4b3a-816d-ff8271118244 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.067074566 +0000 UTC m=+929.308953763 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs") pod "frr-k8s-fjh95" (UID: "53127911-b831-4b3a-816d-ff8271118244") : secret "frr-k8s-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.567200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-metrics\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.567380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-frr-conf\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.568413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53127911-b831-4b3a-816d-ff8271118244-frr-startup\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.568538 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53127911-b831-4b3a-816d-ff8271118244-reloader\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.579696 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zpr27"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.580769 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592148 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-v7jnd" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592431 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.592536 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.606827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvlq\" (UniqueName: \"kubernetes.io/projected/53127911-b831-4b3a-816d-ff8271118244-kube-api-access-ttvlq\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.608225 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.609021 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.618450 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.643738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.675730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.675777 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777377 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777461 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: 
\"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.777525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.777528 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.777601 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert podName:d2f0572c-e661-495c-873c-6e2d18f2ab7d nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.277583901 +0000 UTC m=+929.519463098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert") pod "frr-k8s-webhook-server-7f989f654f-kfnzk" (UID: "d2f0572c-e661-495c-873c-6e2d18f2ab7d") : secret "frr-k8s-webhook-server-cert" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.813692 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmsk\" (UniqueName: \"kubernetes.io/projected/d2f0572c-e661-495c-873c-6e2d18f2ab7d-kube-api-access-2vmsk\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.878877 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.879250 4792 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.879298 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.379286246 +0000 UTC m=+929.621165443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "speaker-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.879968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metallb-excludel2\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880029 4792 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880054 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs podName:f73a6813-31ea-4018-bd23-45bf2f1dfe89 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:20.380046085 +0000 UTC m=+929.621925282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs") pod "controller-86ddb6bd46-twxml" (UID: "f73a6813-31ea-4018-bd23-45bf2f1dfe89") : secret "controller-certs-secret" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880117 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 01 09:23:19 crc kubenswrapper[4792]: E0301 09:23:19.880138 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:23:20.380131267 +0000 UTC m=+929.622010454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "metallb-memberlist" not found Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.882614 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.901534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-cert\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.905008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92z9\" (UniqueName: \"kubernetes.io/projected/f73a6813-31ea-4018-bd23-45bf2f1dfe89-kube-api-access-n92z9\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:19 crc kubenswrapper[4792]: I0301 09:23:19.905736 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkrfz\" (UniqueName: \"kubernetes.io/projected/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-kube-api-access-mkrfz\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.081173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " 
pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.091449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53127911-b831-4b3a-816d-ff8271118244-metrics-certs\") pod \"frr-k8s-fjh95\" (UID: \"53127911-b831-4b3a-816d-ff8271118244\") " pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.283087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.285766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2f0572c-e661-495c-873c-6e2d18f2ab7d-cert\") pod \"frr-k8s-webhook-server-7f989f654f-kfnzk\" (UID: \"d2f0572c-e661-495c-873c-6e2d18f2ab7d\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.383170 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.384114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.385123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.385182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: E0301 09:23:20.385334 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 01 09:23:20 crc kubenswrapper[4792]: E0301 09:23:20.385400 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist podName:8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7 nodeName:}" failed. No retries permitted until 2026-03-01 09:23:21.385376013 +0000 UTC m=+930.627255240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist") pod "speaker-zpr27" (UID: "8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7") : secret "metallb-memberlist" not found Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.387832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f73a6813-31ea-4018-bd23-45bf2f1dfe89-metrics-certs\") pod \"controller-86ddb6bd46-twxml\" (UID: \"f73a6813-31ea-4018-bd23-45bf2f1dfe89\") " pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.387999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-metrics-certs\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.394206 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.501297 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.558437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.753888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-twxml"] Mar 01 09:23:20 crc kubenswrapper[4792]: W0301 09:23:20.760655 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf73a6813_31ea_4018_bd23_45bf2f1dfe89.slice/crio-dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954 WatchSource:0}: Error finding container dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954: Status 404 returned error can't find the container with id dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954 Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.798641 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk"] Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.822389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"dd7b7cadbd0b299fb1c2a072488f1ceccc61669b5710fd67105da12d92e4f954"} Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.823202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"f74d28b794be22820b963fd68426650c305ba12ef9c68cb95e15c544960c5fdb"} Mar 01 09:23:20 crc kubenswrapper[4792]: I0301 09:23:20.825574 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" event={"ID":"d2f0572c-e661-495c-873c-6e2d18f2ab7d","Type":"ContainerStarted","Data":"f943325d04a20f53a0d112a1c731bc17eededc9d94b599cb138af76e39e12202"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.395992 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.401841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7-memberlist\") pod \"speaker-zpr27\" (UID: \"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7\") " pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.692825 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zpr27" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.858298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"eabe384ac232e5735ff297d62687099b4e73e91c340b530929a4cae49694055d"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.860215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-twxml" event={"ID":"f73a6813-31ea-4018-bd23-45bf2f1dfe89","Type":"ContainerStarted","Data":"713ee522ef6f3709439e71064ddec2c9754005218fadb3586ec288372a43e550"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.860349 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.876325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"842e7a6e0b3d1e46c10b1ce8dee51b18771e54a186215e7f33679d56b4259e47"} Mar 01 09:23:21 crc kubenswrapper[4792]: I0301 09:23:21.923347 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-twxml" podStartSLOduration=2.9233276249999998 podStartE2EDuration="2.923327625s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:23:21.912472149 +0000 UTC m=+931.154351346" watchObservedRunningTime="2026-03-01 09:23:21.923327625 +0000 UTC m=+931.165206822" Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.890528 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"08ee59ceb76dc5ca9b8375a9ccea179ce981cbe1aec845042bbc4d3e1d1e9628"} Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.890578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zpr27" event={"ID":"8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7","Type":"ContainerStarted","Data":"2f3df5e44eb8abc0a9ba2ca75ed28771e6a184e173622de84ef7ac03adf5f95d"} Mar 01 09:23:22 crc kubenswrapper[4792]: I0301 09:23:22.910877 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zpr27" podStartSLOduration=3.910858755 podStartE2EDuration="3.910858755s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:23:22.906522988 +0000 UTC m=+932.148402185" watchObservedRunningTime="2026-03-01 09:23:22.910858755 +0000 UTC m=+932.152737952" Mar 01 09:23:23 crc kubenswrapper[4792]: I0301 09:23:23.896468 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zpr27" Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.927190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" event={"ID":"d2f0572c-e661-495c-873c-6e2d18f2ab7d","Type":"ContainerStarted","Data":"9856058333ce3fd85c0ea840c995e331a0fed9a49e9cee3fdfba8075b2489ea6"} Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.928752 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.930600 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="2849c13ea036c394232db22832c7b8a847417cc540e5e50969a2f5e3d8c2e8ce" exitCode=0 Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.930653 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"2849c13ea036c394232db22832c7b8a847417cc540e5e50969a2f5e3d8c2e8ce"} Mar 01 09:23:27 crc kubenswrapper[4792]: I0301 09:23:27.984457 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" podStartSLOduration=2.042168546 podStartE2EDuration="8.984438153s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="2026-03-01 09:23:20.809883327 +0000 UTC m=+930.051762524" lastFinishedPulling="2026-03-01 09:23:27.752152934 +0000 UTC m=+936.994032131" observedRunningTime="2026-03-01 09:23:27.951117095 +0000 UTC m=+937.192996292" watchObservedRunningTime="2026-03-01 09:23:27.984438153 +0000 UTC m=+937.226317350" Mar 01 09:23:28 crc kubenswrapper[4792]: I0301 09:23:28.937633 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="aa4c9d8df43a35764346935e48866f5131887b7f2a4eb9631d0df7fdf5b57e5e" exitCode=0 Mar 01 09:23:28 crc kubenswrapper[4792]: I0301 09:23:28.937722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"aa4c9d8df43a35764346935e48866f5131887b7f2a4eb9631d0df7fdf5b57e5e"} Mar 01 09:23:29 crc kubenswrapper[4792]: I0301 09:23:29.945824 4792 generic.go:334] "Generic (PLEG): container finished" podID="53127911-b831-4b3a-816d-ff8271118244" containerID="c5567badd19bd0774a5879cb60e83c9e9efe133a039626364b25b32420a0fbd9" exitCode=0 Mar 01 09:23:29 crc kubenswrapper[4792]: I0301 09:23:29.946174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerDied","Data":"c5567badd19bd0774a5879cb60e83c9e9efe133a039626364b25b32420a0fbd9"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.576585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-twxml" Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"34e4bb559cfb9d6e311c8dfbe2b6e322b66c7cef954dce8387c2f2aea3eda5bd"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"dfc24e756e4ba7fa7e3d2587c6a5e57d3eafe42995d193b132e5e55c76946896"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"26f91c5c157dae44d9ef6b2752ae51f2a09dfb570bcddcd67b7b331f9f89f181"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" 
event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"90be616421c9f3419a458a1fef2213ebed5c52b4aa4b8799bed8863dac9370f9"} Mar 01 09:23:30 crc kubenswrapper[4792]: I0301 09:23:30.963522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"4afebf9e80babf7e58e9ef7416b26544ff34766a1d654b36f2e4854e21aa3036"} Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.973296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjh95" event={"ID":"53127911-b831-4b3a-816d-ff8271118244","Type":"ContainerStarted","Data":"f5a0c46df22e1e5bf9fd8e6d06a9d81d9a441358efdd9259aa7325c336c66cf0"} Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.973591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:31 crc kubenswrapper[4792]: I0301 09:23:31.995491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fjh95" podStartSLOduration=5.726734755 podStartE2EDuration="12.995470342s" podCreationTimestamp="2026-03-01 09:23:19 +0000 UTC" firstStartedPulling="2026-03-01 09:23:20.50107341 +0000 UTC m=+929.742952607" lastFinishedPulling="2026-03-01 09:23:27.769809007 +0000 UTC m=+937.011688194" observedRunningTime="2026-03-01 09:23:31.991042363 +0000 UTC m=+941.232921560" watchObservedRunningTime="2026-03-01 09:23:31.995470342 +0000 UTC m=+941.237349529" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942577 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942822 4792 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.942855 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.943584 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:23:34 crc kubenswrapper[4792]: I0301 09:23:34.943637 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521" gracePeriod=600 Mar 01 09:23:35 crc kubenswrapper[4792]: I0301 09:23:35.384093 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:35 crc kubenswrapper[4792]: I0301 09:23:35.439994 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.001157 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521" exitCode=0 Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002242 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521"} Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"} Mar 01 09:23:36 crc kubenswrapper[4792]: I0301 09:23:36.002448 4792 scope.go:117] "RemoveContainer" containerID="3d6c435219d8bc835f1f13dc6f334e1ca86fd764226b575caaadee4dad9e5209" Mar 01 09:23:40 crc kubenswrapper[4792]: I0301 09:23:40.386007 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-fjh95" Mar 01 09:23:40 crc kubenswrapper[4792]: I0301 09:23:40.398803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-kfnzk" Mar 01 09:23:41 crc kubenswrapper[4792]: I0301 09:23:41.698391 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zpr27" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.342619 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.344049 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.347764 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.347778 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-h4vjh" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.351457 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.392085 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.395434 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.493244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.512529 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"openstack-operator-index-wsvzs\" (UID: 
\"00b31e3f-8443-487a-916a-59ec98ccd4de\") " pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.663810 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:44 crc kubenswrapper[4792]: I0301 09:23:44.879436 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:45 crc kubenswrapper[4792]: I0301 09:23:45.057375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerStarted","Data":"e98ee9c3b90fa09d03524b35e6a99ec6a5872d1dba45dc8484cb95ef1f21e4e3"} Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.069797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerStarted","Data":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.083411 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wsvzs" podStartSLOduration=2.020652226 podStartE2EDuration="3.083392569s" podCreationTimestamp="2026-03-01 09:23:44 +0000 UTC" firstStartedPulling="2026-03-01 09:23:44.88775647 +0000 UTC m=+954.129635667" lastFinishedPulling="2026-03-01 09:23:45.950496813 +0000 UTC m=+955.192376010" observedRunningTime="2026-03-01 09:23:47.082436895 +0000 UTC m=+956.324316102" watchObservedRunningTime="2026-03-01 09:23:47.083392569 +0000 UTC m=+956.325271776" Mar 01 09:23:47 crc kubenswrapper[4792]: I0301 09:23:47.703307 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.311828 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.312694 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.325012 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.437381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.538531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.558011 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllql\" (UniqueName: \"kubernetes.io/projected/dc22117a-72a7-4838-bb1c-111e91514b98-kube-api-access-hllql\") pod \"openstack-operator-index-5kfk4\" (UID: \"dc22117a-72a7-4838-bb1c-111e91514b98\") " pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:48 crc kubenswrapper[4792]: I0301 09:23:48.630950 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.039095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5kfk4"] Mar 01 09:23:49 crc kubenswrapper[4792]: W0301 09:23:49.044526 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc22117a_72a7_4838_bb1c_111e91514b98.slice/crio-00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34 WatchSource:0}: Error finding container 00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34: Status 404 returned error can't find the container with id 00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34 Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.088272 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wsvzs" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" containerID="cri-o://059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" gracePeriod=2 Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.088649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kfk4" event={"ID":"dc22117a-72a7-4838-bb1c-111e91514b98","Type":"ContainerStarted","Data":"00e2da99ed5c797263d32d1b50464306d1a3e39c15fbab96cd76bab996b0fa34"} Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.457505 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.652771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") pod \"00b31e3f-8443-487a-916a-59ec98ccd4de\" (UID: \"00b31e3f-8443-487a-916a-59ec98ccd4de\") " Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.658954 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn" (OuterVolumeSpecName: "kube-api-access-vn9gn") pod "00b31e3f-8443-487a-916a-59ec98ccd4de" (UID: "00b31e3f-8443-487a-916a-59ec98ccd4de"). InnerVolumeSpecName "kube-api-access-vn9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:23:49 crc kubenswrapper[4792]: I0301 09:23:49.754684 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn9gn\" (UniqueName: \"kubernetes.io/projected/00b31e3f-8443-487a-916a-59ec98ccd4de-kube-api-access-vn9gn\") on node \"crc\" DevicePath \"\"" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094729 4792 generic.go:334] "Generic (PLEG): container finished" podID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" exitCode=0 Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerDied","Data":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wsvzs" 
event={"ID":"00b31e3f-8443-487a-916a-59ec98ccd4de","Type":"ContainerDied","Data":"e98ee9c3b90fa09d03524b35e6a99ec6a5872d1dba45dc8484cb95ef1f21e4e3"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.094868 4792 scope.go:117] "RemoveContainer" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.095014 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wsvzs" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.101195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5kfk4" event={"ID":"dc22117a-72a7-4838-bb1c-111e91514b98","Type":"ContainerStarted","Data":"b278cb80c20ca321724b63e627678ef7ebad47bdb0fac71db7314c188b072401"} Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.116543 4792 scope.go:117] "RemoveContainer" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: E0301 09:23:50.116958 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": container with ID starting with 059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe not found: ID does not exist" containerID="059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.116993 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe"} err="failed to get container status \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": rpc error: code = NotFound desc = could not find container \"059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe\": container with ID starting with 
059995c34f2b45f2d80e005bcf0169eacc91be21aa2b7c23d727bfaab5828afe not found: ID does not exist" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.130196 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5kfk4" podStartSLOduration=1.752191838 podStartE2EDuration="2.13015009s" podCreationTimestamp="2026-03-01 09:23:48 +0000 UTC" firstStartedPulling="2026-03-01 09:23:49.049030466 +0000 UTC m=+958.290909663" lastFinishedPulling="2026-03-01 09:23:49.426988718 +0000 UTC m=+958.668867915" observedRunningTime="2026-03-01 09:23:50.120100983 +0000 UTC m=+959.361980190" watchObservedRunningTime="2026-03-01 09:23:50.13015009 +0000 UTC m=+959.372029287" Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.141120 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:50 crc kubenswrapper[4792]: I0301 09:23:50.144273 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wsvzs"] Mar 01 09:23:51 crc kubenswrapper[4792]: I0301 09:23:51.423330 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" path="/var/lib/kubelet/pods/00b31e3f-8443-487a-916a-59ec98ccd4de/volumes" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.632127 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.632485 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:58 crc kubenswrapper[4792]: I0301 09:23:58.660934 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:23:59 crc kubenswrapper[4792]: I0301 09:23:59.193380 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5kfk4" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.124324 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:24:00 crc kubenswrapper[4792]: E0301 09:24:00.124858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.124873 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.125022 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b31e3f-8443-487a-916a-59ec98ccd4de" containerName="registry-server" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.125455 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.128246 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.128899 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.135073 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.178220 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.191187 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"] Mar 01 09:24:00 
crc kubenswrapper[4792]: I0301 09:24:00.192525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195185 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195241 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.195276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: 
\"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.206501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-s7c52" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.212625 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"] Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296614 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.296928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.300179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.316590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.317022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"auto-csr-approver-29539284-g5rbc\" (UID: 
\"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.489102 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.507427 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.765462 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2"] Mar 01 09:24:00 crc kubenswrapper[4792]: I0301 09:24:00.900096 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:24:00 crc kubenswrapper[4792]: W0301 09:24:00.904652 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1263f40a_23c7_4ab8_8ebc_7c697e2eacd6.slice/crio-6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3 WatchSource:0}: Error finding container 6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3: Status 404 returned error can't find the container with id 6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3 Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.199552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerStarted","Data":"6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3"} Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202104 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" 
containerID="e303016694d80717056bed8ebe7ac0ed91d935075c1a2c8363a48e9a34247c9f" exitCode=0 Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"e303016694d80717056bed8ebe7ac0ed91d935075c1a2c8363a48e9a34247c9f"} Mar 01 09:24:01 crc kubenswrapper[4792]: I0301 09:24:01.202393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerStarted","Data":"901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0"} Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.216626 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerStarted","Data":"25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110"} Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.221133 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerID="857a9688c261d295e943ef630ce635540fb82ab9abcae64b93c371953ef33783" exitCode=0 Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.221259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"857a9688c261d295e943ef630ce635540fb82ab9abcae64b93c371953ef33783"} Mar 01 09:24:02 crc kubenswrapper[4792]: I0301 09:24:02.237980 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" podStartSLOduration=1.290508193 
podStartE2EDuration="2.237961349s" podCreationTimestamp="2026-03-01 09:24:00 +0000 UTC" firstStartedPulling="2026-03-01 09:24:00.906807599 +0000 UTC m=+970.148686816" lastFinishedPulling="2026-03-01 09:24:01.854260765 +0000 UTC m=+971.096139972" observedRunningTime="2026-03-01 09:24:02.234848322 +0000 UTC m=+971.476727549" watchObservedRunningTime="2026-03-01 09:24:02.237961349 +0000 UTC m=+971.479840556" Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.232552 4792 generic.go:334] "Generic (PLEG): container finished" podID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerID="6ff8df30eacd38cf3a14434c98b1a8b0327d53de2b6a40501367acbf0b0e60ea" exitCode=0 Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.232651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"6ff8df30eacd38cf3a14434c98b1a8b0327d53de2b6a40501367acbf0b0e60ea"} Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.235987 4792 generic.go:334] "Generic (PLEG): container finished" podID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerID="25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110" exitCode=0 Mar 01 09:24:03 crc kubenswrapper[4792]: I0301 09:24:03.236079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerDied","Data":"25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110"} Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.488279 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.565628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") pod \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\" (UID: \"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6\") " Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.570723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58" (OuterVolumeSpecName: "kube-api-access-nwt58") pod "1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" (UID: "1263f40a-23c7-4ab8-8ebc-7c697e2eacd6"). InnerVolumeSpecName "kube-api-access-nwt58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.571944 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667067 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667212 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") pod \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\" (UID: \"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd\") " Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.667402 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt58\" (UniqueName: \"kubernetes.io/projected/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6-kube-api-access-nwt58\") on node \"crc\" DevicePath \"\"" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.668380 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle" (OuterVolumeSpecName: "bundle") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.670428 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn" (OuterVolumeSpecName: "kube-api-access-cqkqn") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "kube-api-access-cqkqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.683048 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util" (OuterVolumeSpecName: "util") pod "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" (UID: "d4447fd9-d2df-47f4-a94f-ff8b4c5080bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768467 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-util\") on node \"crc\" DevicePath \"\"" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768708 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:24:04 crc kubenswrapper[4792]: I0301 09:24:04.768790 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqkqn\" (UniqueName: \"kubernetes.io/projected/d4447fd9-d2df-47f4-a94f-ff8b4c5080bd-kube-api-access-cqkqn\") on node \"crc\" DevicePath \"\"" Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" 
event={"ID":"1263f40a-23c7-4ab8-8ebc-7c697e2eacd6","Type":"ContainerDied","Data":"6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3"} Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251624 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d4a810ccd8818aa2aa4996c08f4aecc31ca3e184af445411fec70a53dd8b3d3" Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.251314 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539284-g5rbc" Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" event={"ID":"d4447fd9-d2df-47f4-a94f-ff8b4c5080bd","Type":"ContainerDied","Data":"901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0"} Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253493 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901d71ab1004a975c28ad1740acca30e4ad62081d579542b1327000938dbb1e0" Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.253595 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2" Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.546037 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"] Mar 01 09:24:05 crc kubenswrapper[4792]: I0301 09:24:05.551435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539278-dkkbp"] Mar 01 09:24:07 crc kubenswrapper[4792]: I0301 09:24:07.416878 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d17aa56-5b61-403d-9d20-cb300aabc44d" path="/var/lib/kubelet/pods/0d17aa56-5b61-403d-9d20-cb300aabc44d/volumes" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.753265 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"] Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.754723 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="util" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.754824 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="util" Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.754886 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.755725 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc" Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.755966 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.756129 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract" Mar 01 09:24:12 crc kubenswrapper[4792]: E0301 09:24:12.756245 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="pull" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="pull" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757589 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4447fd9-d2df-47f4-a94f-ff8b4c5080bd" containerName="extract" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.757736 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" containerName="oc" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.758612 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.764500 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dq8rd" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.772974 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.783450 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"] Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.874615 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:12 crc kubenswrapper[4792]: I0301 09:24:12.904495 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfs7s\" (UniqueName: \"kubernetes.io/projected/c967e6f5-6388-4ae5-9ccf-379b6305e1b0-kube-api-access-jfs7s\") pod \"openstack-operator-controller-init-595c94944c-vtchh\" (UID: \"c967e6f5-6388-4ae5-9ccf-379b6305e1b0\") " pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:13 crc kubenswrapper[4792]: I0301 09:24:13.072571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:13 crc kubenswrapper[4792]: I0301 09:24:13.537393 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-595c94944c-vtchh"] Mar 01 09:24:13 crc kubenswrapper[4792]: W0301 09:24:13.541501 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc967e6f5_6388_4ae5_9ccf_379b6305e1b0.slice/crio-f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4 WatchSource:0}: Error finding container f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4: Status 404 returned error can't find the container with id f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4 Mar 01 09:24:14 crc kubenswrapper[4792]: I0301 09:24:14.309038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" 
event={"ID":"c967e6f5-6388-4ae5-9ccf-379b6305e1b0","Type":"ContainerStarted","Data":"f586b15186f380abfcd39f1711097e90a17d914ffbbee23bffaf50c18c7555d4"} Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.349811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" event={"ID":"c967e6f5-6388-4ae5-9ccf-379b6305e1b0","Type":"ContainerStarted","Data":"7c99d3874715f2ca02422dd8190b6185d9da40caf33a75c71acb9739fc7fe999"} Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.350456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:18 crc kubenswrapper[4792]: I0301 09:24:18.385673 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" podStartSLOduration=1.775006104 podStartE2EDuration="6.385656424s" podCreationTimestamp="2026-03-01 09:24:12 +0000 UTC" firstStartedPulling="2026-03-01 09:24:13.545318599 +0000 UTC m=+982.787197796" lastFinishedPulling="2026-03-01 09:24:18.155968919 +0000 UTC m=+987.397848116" observedRunningTime="2026-03-01 09:24:18.380793065 +0000 UTC m=+987.622672262" watchObservedRunningTime="2026-03-01 09:24:18.385656424 +0000 UTC m=+987.627535621" Mar 01 09:24:23 crc kubenswrapper[4792]: I0301 09:24:23.075414 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-595c94944c-vtchh" Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.487221 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.488751 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.507935 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"]
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.602178 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703128 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703170 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.703716 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.733457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"redhat-marketplace-29ht5\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:37 crc kubenswrapper[4792]: I0301 09:24:37.804294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5"
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.215677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"]
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.466236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57"}
Mar 01 09:24:38 crc kubenswrapper[4792]: I0301 09:24:38.466575 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2"}
Mar 01 09:24:39 crc kubenswrapper[4792]: I0301 09:24:39.472117 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57" exitCode=0
Mar 01 09:24:39 crc kubenswrapper[4792]: I0301 09:24:39.472154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57"}
Mar 01 09:24:40 crc kubenswrapper[4792]: I0301 09:24:40.480685 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785" exitCode=0
Mar 01 09:24:40 crc kubenswrapper[4792]: I0301 09:24:40.480791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785"}
Mar 01 09:24:41 crc kubenswrapper[4792]: I0301 09:24:41.487695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerStarted","Data":"8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1"}
Mar 01 09:24:41 crc kubenswrapper[4792]: I0301 09:24:41.521512 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-29ht5" podStartSLOduration=3.123782451 podStartE2EDuration="4.521493524s" podCreationTimestamp="2026-03-01 09:24:37 +0000 UTC" firstStartedPulling="2026-03-01 09:24:39.473469596 +0000 UTC m=+1008.715348793" lastFinishedPulling="2026-03-01 09:24:40.871180659 +0000 UTC m=+1010.113059866" observedRunningTime="2026-03-01 09:24:41.51925853 +0000 UTC m=+1010.761137727" watchObservedRunningTime="2026-03-01 09:24:41.521493524 +0000 UTC m=+1010.763372721"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.552277 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.553003 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.562448 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.563168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.563658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-46k6h"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.564464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.564508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.569257 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6bf7t"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.578356 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.609150 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.609862 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.615063 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v4bn8"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.616544 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.632840 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.633605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.644786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.645194 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-btdr4"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.656876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665436 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.665567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.701686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6r9\" (UniqueName: \"kubernetes.io/projected/b9e3fd6b-e3e2-4380-b8d7-900891df562a-kube-api-access-nj6r9\") pod \"barbican-operator-controller-manager-6db6876945-ggspg\" (UID: \"b9e3fd6b-e3e2-4380-b8d7-900891df562a\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.705448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jd5z\" (UniqueName: \"kubernetes.io/projected/8741a141-0194-4eb2-956e-c41f4ffe1338-kube-api-access-7jd5z\") pod \"cinder-operator-controller-manager-55d77d7b5c-jlnsb\" (UID: \"8741a141-0194-4eb2-956e-c41f4ffe1338\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.716854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.727241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.731973 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.732989 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.735268 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-f968d"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.743192 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-s6s9l"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.748742 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.767619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.767666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.776100 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.791289 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.792159 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.806375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.806612 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn8ng"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.813296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqf5f\" (UniqueName: \"kubernetes.io/projected/02dd5cc0-c44b-4ede-972b-9d26c9c54100-kube-api-access-jqf5f\") pod \"glance-operator-controller-manager-64db6967f8-9wzbh\" (UID: \"02dd5cc0-c44b-4ede-972b-9d26c9c54100\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.828006 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.830547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxh8z\" (UniqueName: \"kubernetes.io/projected/bf1f37ea-a566-4dfd-b45b-02f284f19ce3-kube-api-access-lxh8z\") pod \"designate-operator-controller-manager-5d87c9d997-72srw\" (UID: \"bf1f37ea-a566-4dfd-b45b-02f284f19ce3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.838348 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.838954 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.839420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.839725 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.844711 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gqj5r"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.845033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-sdj2j"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.856723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.869466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.877236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.869497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6l9\" (UniqueName: \"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879426 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.879458 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.883235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.888591 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.909254 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.910176 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.913264 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ltcf8"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.919358 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.920132 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.923570 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7lngp"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.932542 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.934962 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.967196 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.971476 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"]
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.972224 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980548 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980691 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980789 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"
Mar 01 09:24:42 crc kubenswrapper[4792]: I0301 09:24:42.980808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6l9\" (UniqueName: \"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:42 crc kubenswrapper[4792]: E0301 09:24:42.981396 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:42 crc kubenswrapper[4792]: E0301 09:24:42.981469 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:43.481448883 +0000 UTC m=+1012.723328070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.003266 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7qcsv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.016156 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.026628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7q8t\" (UniqueName: \"kubernetes.io/projected/2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5-kube-api-access-c7q8t\") pod \"ironic-operator-controller-manager-545456dc4-jvw5j\" (UID: \"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.044284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsb8q\" (UniqueName: \"kubernetes.io/projected/5044cf86-f557-41d4-b6c0-a41a668ac999-kube-api-access-vsb8q\") pod \"heat-operator-controller-manager-cf99c678f-7v65r\" (UID: \"5044cf86-f557-41d4-b6c0-a41a668ac999\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.047100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8khs\" (UniqueName: \"kubernetes.io/projected/cd83ed19-023d-43c2-92db-d290499db3d4-kube-api-access-d8khs\") pod \"horizon-operator-controller-manager-78bc7f9bd9-55qzx\" (UID: \"cd83ed19-023d-43c2-92db-d290499db3d4\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.050961 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.054192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwh7\" (UniqueName: \"kubernetes.io/projected/234d2ae5-7589-44cc-83f4-b0ee8a91940a-kube-api-access-sxwh7\") pod \"keystone-operator-controller-manager-7c789f89c6-wjf62\" (UID: \"234d2ae5-7589-44cc-83f4-b0ee8a91940a\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.054393 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.055194 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.055924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6l9\" (UniqueName: \"kubernetes.io/projected/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-kube-api-access-7t6l9\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.065562 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mbvwj"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.067481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.067723 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.130038 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.137788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.138365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.138627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.154287 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.155502 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.161658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8znlx"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.165301 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"]
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.180917 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.214304 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.215643 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.216433 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.228926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgmx\" (UniqueName: \"kubernetes.io/projected/dfb10d33-c4f1-4287-be83-dff835c733ba-kube-api-access-vxgmx\") pod \"neutron-operator-controller-manager-54688575f-qjqd2\" (UID: \"dfb10d33-c4f1-4287-be83-dff835c733ba\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.230872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9d52\" (UniqueName: \"kubernetes.io/projected/376afe52-646d-44b7-b32e-ce6cd6dc21a6-kube-api-access-q9d52\") pod \"manila-operator-controller-manager-67d996989d-t5fsn\" (UID: \"376afe52-646d-44b7-b32e-ce6cd6dc21a6\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.237355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6dst" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.237773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhplh\" (UniqueName: \"kubernetes.io/projected/1793465e-1273-4250-a238-c99798788618-kube-api-access-rhplh\") pod \"mariadb-operator-controller-manager-7b6bfb6475-hlzm6\" (UID: \"1793465e-1273-4250-a238-c99798788618\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.242133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.242776 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.262420 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.263496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.267519 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8kxt9" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.290465 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.293882 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.300575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-l5f87" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.304482 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.340788 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348184 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348345 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.348459 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.358543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.386847 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.387848 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.399198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-svkrk" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.412124 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdk4\" (UniqueName: \"kubernetes.io/projected/8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9-kube-api-access-pcdk4\") pod \"nova-operator-controller-manager-74b6b5dc96-knk7m\" (UID: \"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.460625 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.471867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.471978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.472182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: 
E0301 09:24:43.472693 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.472735 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:43.972719927 +0000 UTC m=+1013.214599114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.496159 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.520230 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.529254 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.553148 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.557506 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.559656 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.565145 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.565936 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.574333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.584405 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x7m9p" Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.584557 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:43 crc kubenswrapper[4792]: E0301 09:24:43.584609 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.584592481 +0000 UTC m=+1013.826471678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.584691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.585320 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ssz2h" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.585694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6l4\" (UniqueName: \"kubernetes.io/projected/ecc17c18-7695-4d22-9a95-bcac51800d60-kube-api-access-lc6l4\") pod \"octavia-operator-controller-manager-5d86c7ddb7-54rpl\" (UID: \"ecc17c18-7695-4d22-9a95-bcac51800d60\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.589102 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.610584 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.611474 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.625097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.637325 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ts4wr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.650485 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24nn\" (UniqueName: \"kubernetes.io/projected/9244686e-175e-45f9-9eb7-23621cd1f3cd-kube-api-access-m24nn\") pod 
\"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.675805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.695655 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct2d\" (UniqueName: \"kubernetes.io/projected/808b8753-0a20-419b-8b04-dcbccaa2d77e-kube-api-access-fct2d\") pod \"placement-operator-controller-manager-648564c9fc-jdn6k\" (UID: \"808b8753-0a20-419b-8b04-dcbccaa2d77e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.697311 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.698188 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.698727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rgc\" (UniqueName: \"kubernetes.io/projected/e0cef8e2-a392-4612-97c6-17c611b2a44e-kube-api-access-74rgc\") pod \"swift-operator-controller-manager-9b9ff9f4d-mqndr\" (UID: \"e0cef8e2-a392-4612-97c6-17c611b2a44e\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-chzwj" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.702993 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.712744 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.718219 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbz6d\" (UniqueName: \"kubernetes.io/projected/3d38195c-e4ff-49cf-9592-e9f52d73f2df-kube-api-access-hbz6d\") pod \"ovn-operator-controller-manager-75684d597f-zkx7c\" (UID: \"3d38195c-e4ff-49cf-9592-e9f52d73f2df\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.732778 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw"] Mar 01 09:24:43 crc kubenswrapper[4792]: 
I0301 09:24:43.768391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.791976 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.792491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.792544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " 
pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897372 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.897479 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.918599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9fq\" (UniqueName: \"kubernetes.io/projected/4fe8270e-a46d-40bc-8d24-a4585b196f5e-kube-api-access-8l9fq\") pod \"telemetry-operator-controller-manager-5fdb694969-jpxwz\" (UID: \"4fe8270e-a46d-40bc-8d24-a4585b196f5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 
09:24:43.927785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrmm\" (UniqueName: \"kubernetes.io/projected/2970c60c-7b03-4667-99e4-08c094cdbfc2-kube-api-access-dmrmm\") pod \"test-operator-controller-manager-55b5ff4dbb-bcnns\" (UID: \"2970c60c-7b03-4667-99e4-08c094cdbfc2\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.937684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwgz\" (UniqueName: \"kubernetes.io/projected/e45ebab9-87d5-4b2f-b3d1-f1832864584d-kube-api-access-7jwgz\") pod \"watcher-operator-controller-manager-bccc79885-64lkf\" (UID: \"e45ebab9-87d5-4b2f-b3d1-f1832864584d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.948715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.965960 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"] Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.966714 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.967327 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.977412 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vr24h" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.978924 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:24:43 crc kubenswrapper[4792]: I0301 09:24:43.989121 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:43.996819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"] Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010221 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.010522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.010981 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011020 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.511006443 +0000 UTC m=+1013.752885640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011060 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011078 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.011072125 +0000 UTC m=+1014.252951322 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011109 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.011127 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:44.511122146 +0000 UTC m=+1013.753001333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.050530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2wh\" (UniqueName: \"kubernetes.io/projected/d1d3783f-78e9-461a-916a-5a46e3083e70-kube-api-access-vh2wh\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.113608 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.216294 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.225314 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.259750 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsppl\" (UniqueName: \"kubernetes.io/projected/1ecd6b07-eda9-41d6-90af-6471699ff808-kube-api-access-nsppl\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5l9m\" (UID: \"1ecd6b07-eda9-41d6-90af-6471699ff808\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.371168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.491335 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb"] Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.497139 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg"] Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.527143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" event={"ID":"bf1f37ea-a566-4dfd-b45b-02f284f19ce3","Type":"ContainerStarted","Data":"ba10b617caf1f37d826b32cdf76709c174e0681e6bf47c736021b5c1eddff1d1"} Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.527457 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: 
I0301 09:24:44.527526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.529306 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.529361 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.529346841 +0000 UTC m=+1014.771226038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.536006 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.536051 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:45.536039335 +0000 UTC m=+1014.777918532 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: W0301 09:24:44.549210 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8741a141_0194_4eb2_956e_c41f4ffe1338.slice/crio-d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a WatchSource:0}: Error finding container d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a: Status 404 returned error can't find the container with id d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.570130 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r"] Mar 01 09:24:44 crc kubenswrapper[4792]: I0301 09:24:44.629114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.629336 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:44 crc kubenswrapper[4792]: E0301 09:24:44.629382 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. 
No retries permitted until 2026-03-01 09:24:46.629367955 +0000 UTC m=+1015.871247152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.032887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.033922 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.033980 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.033962642 +0000 UTC m=+1016.275841839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.533641 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" event={"ID":"8741a141-0194-4eb2-956e-c41f4ffe1338","Type":"ContainerStarted","Data":"d5adb0b48ab275232d9602130968f0100e1c618b83eacd5c6a159607b3b0635a"} Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.534704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" event={"ID":"b9e3fd6b-e3e2-4380-b8d7-900891df562a","Type":"ContainerStarted","Data":"aa3b1164026ce2782aba927f155fc7e25c18b5439d62dda12c446f84dd90c59e"} Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.540026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:45 crc kubenswrapper[4792]: I0301 09:24:45.540199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540328 4792 secret.go:188] Couldn't 
get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540386 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.540372977 +0000 UTC m=+1016.782252174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540731 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:45 crc kubenswrapper[4792]: E0301 09:24:45.540772 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:47.540762166 +0000 UTC m=+1016.782641363 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.081424 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.572888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.581322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" event={"ID":"cd83ed19-023d-43c2-92db-d290499db3d4","Type":"ContainerStarted","Data":"80b1a2de06a8adc305b302ebf919b5861887ec48b367ad9db461383f27c4cd8b"} Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.594346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.601173 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.601817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" event={"ID":"5044cf86-f557-41d4-b6c0-a41a668ac999","Type":"ContainerStarted","Data":"e1b9edd3b1e3986848baa31be4313f49d7a1d702a243dcf54f6ab6440910eafa"} Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.639001 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn"] Mar 01 09:24:46 crc 
kubenswrapper[4792]: I0301 09:24:46.697003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:46 crc kubenswrapper[4792]: E0301 09:24:46.697127 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: E0301 09:24:46.697173 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:50.697157677 +0000 UTC m=+1019.939036874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.711982 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.781529 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62"] Mar 01 09:24:46 crc kubenswrapper[4792]: W0301 09:24:46.790935 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234d2ae5_7589_44cc_83f4_b0ee8a91940a.slice/crio-960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9 WatchSource:0}: Error finding 
container 960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9: Status 404 returned error can't find the container with id 960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9 Mar 01 09:24:46 crc kubenswrapper[4792]: W0301 09:24:46.822441 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1793465e_1273_4250_a238_c99798788618.slice/crio-c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16 WatchSource:0}: Error finding container c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16: Status 404 returned error can't find the container with id c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16 Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.823549 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6"] Mar 01 09:24:46 crc kubenswrapper[4792]: I0301 09:24:46.921817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.015185 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.033817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr"] Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.035554 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d38195c_e4ff_49cf_9592_e9f52d73f2df.slice/crio-4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42 WatchSource:0}: Error finding container 4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42: Status 404 returned error can't find the container with id 
4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42 Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.054268 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0cef8e2_a392_4612_97c6_17c611b2a44e.slice/crio-a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2 WatchSource:0}: Error finding container a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2: Status 404 returned error can't find the container with id a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2 Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.099990 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.118092 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.118282 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.118342 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.118323551 +0000 UTC m=+1020.360202748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.165034 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.176516 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.230097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf"] Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.238677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k"] Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.241632 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecd6b07_eda9_41d6_90af_6471699ff808.slice/crio-d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782 WatchSource:0}: Error finding container d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782: Status 404 returned error can't find the container with id d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782 Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.257524 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jwgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-64lkf_openstack-operators(e45ebab9-87d5-4b2f-b3d1-f1832864584d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.258723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:47 crc kubenswrapper[4792]: W0301 09:24:47.273759 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808b8753_0a20_419b_8b04_dcbccaa2d77e.slice/crio-7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38 WatchSource:0}: Error finding container 7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38: Status 404 returned error can't find the container with id 7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38 Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.276545 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fct2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-jdn6k_openstack-operators(808b8753-0a20-419b-8b04-dcbccaa2d77e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.278178 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.631692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.631803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.631932 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632006 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632027 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.632007714 +0000 UTC m=+1020.873886901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.632069 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:51.632047955 +0000 UTC m=+1020.873927152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.638738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" event={"ID":"3d38195c-e4ff-49cf-9592-e9f52d73f2df","Type":"ContainerStarted","Data":"4e28b880eea6a7bbde96fd8689733c02a6616cf895cd282660b903a0ac994c42"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.642689 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" event={"ID":"e0cef8e2-a392-4612-97c6-17c611b2a44e","Type":"ContainerStarted","Data":"a5c2f882f8dcf8e2e6ceeec8064ddf7477298dd78dc8a91fcdbe032c8be5b4f2"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.668570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" event={"ID":"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5","Type":"ContainerStarted","Data":"135a878ebf26e55ab721a8b3c8811e6af8d3324717b4b08c3ff36c2f50a806d5"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.671451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" event={"ID":"ecc17c18-7695-4d22-9a95-bcac51800d60","Type":"ContainerStarted","Data":"7699d89194923baa5130b849d85533c2a4f80b67c3079cadb83ed7942a341128"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.672594 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" 
event={"ID":"234d2ae5-7589-44cc-83f4-b0ee8a91940a","Type":"ContainerStarted","Data":"960c1a0af0708f56ab6591644ca45dde3daa4e5598f8db495768e71b953665b9"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.673882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" event={"ID":"808b8753-0a20-419b-8b04-dcbccaa2d77e","Type":"ContainerStarted","Data":"7f7e9f7e093cd111450949696d888963e0ce1a82c52679f952863446b592ba38"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.675008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" event={"ID":"4fe8270e-a46d-40bc-8d24-a4585b196f5e","Type":"ContainerStarted","Data":"94c097d9c3e6a279e675034e53573d9abf36ff13f40f2a4f7bd46fedbbd4a885"} Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.675154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.677027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" event={"ID":"2970c60c-7b03-4667-99e4-08c094cdbfc2","Type":"ContainerStarted","Data":"17aab2391d2d662cc1edf1caabcc155551d4b00dd50fb74030ede2e96fb63e50"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.679505 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" 
event={"ID":"e45ebab9-87d5-4b2f-b3d1-f1832864584d","Type":"ContainerStarted","Data":"ee959300b9c941a04eb70d52b1531fd8c48a4983f714abc59aa4c8f3d9148c49"} Mar 01 09:24:47 crc kubenswrapper[4792]: E0301 09:24:47.680962 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.681727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" event={"ID":"1ecd6b07-eda9-41d6-90af-6471699ff808","Type":"ContainerStarted","Data":"d691d0399e469677d6ae9177034885648dffc0a75190738c79db56036d6e9782"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.683845 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" event={"ID":"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9","Type":"ContainerStarted","Data":"dfe782533c3722521e20eace8fe429e3a265f6fb32963109d2cf78121d3153ac"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.685708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" event={"ID":"376afe52-646d-44b7-b32e-ce6cd6dc21a6","Type":"ContainerStarted","Data":"609b87890865e6905eeff0b5ac5b546e05c5efef3c92fab1622c004625e51d14"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.689396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" 
event={"ID":"dfb10d33-c4f1-4287-be83-dff835c733ba","Type":"ContainerStarted","Data":"00a2c32629d94ae8fd350196ced6e1e95c4402f4a0916b2f1ad846c6db2f17aa"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.693680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" event={"ID":"02dd5cc0-c44b-4ede-972b-9d26c9c54100","Type":"ContainerStarted","Data":"69a665b2ed8de862fa907844a28352a770d035c6e90eac4279c14d89c6409a8f"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.697575 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" event={"ID":"1793465e-1273-4250-a238-c99798788618","Type":"ContainerStarted","Data":"c3c82ce45a98801bb1bab97f4fbba734ecdffde5eff16e4d1ae09024fd6bba16"} Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.804735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.804783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:47 crc kubenswrapper[4792]: I0301 09:24:47.869483 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:48 crc kubenswrapper[4792]: E0301 09:24:48.729147 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:24:48 crc kubenswrapper[4792]: E0301 09:24:48.729514 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.875471 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.876919 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.903749 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.967934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.967988 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:48 crc kubenswrapper[4792]: I0301 09:24:48.968056 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xztgl\" (UniqueName: 
\"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:48.996108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.068888 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.069553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 
crc kubenswrapper[4792]: I0301 09:24:49.069571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.108281 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"certified-operators-vgfjs\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:49 crc kubenswrapper[4792]: I0301 09:24:49.205651 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.000288 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.710052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:50 crc kubenswrapper[4792]: E0301 09:24:50.710425 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:50 crc kubenswrapper[4792]: E0301 09:24:50.710481 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert 
podName:ea6739c2-185a-43e7-8fcf-0b2ae31957a0 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:58.710467633 +0000 UTC m=+1027.952346830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert") pod "infra-operator-controller-manager-f7fcc58b9-dsqtf" (UID: "ea6739c2-185a-43e7-8fcf-0b2ae31957a0") : secret "infra-operator-webhook-server-cert" not found Mar 01 09:24:50 crc kubenswrapper[4792]: I0301 09:24:50.798398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerStarted","Data":"07b2ccb12f444a169347668634bc26575de0ebccb8ee9dc035b529cef91259bc"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.217733 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.218006 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.218052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert podName:9244686e-175e-45f9-9eb7-23621cd1f3cd nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.218039116 +0000 UTC m=+1028.459918303 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert") pod "openstack-baremetal-operator-controller-manager-7b4cc4776948grv" (UID: "9244686e-175e-45f9-9eb7-23621cd1f3cd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.302960 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.303180 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-29ht5" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" containerID="cri-o://8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" gracePeriod=2 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.532762 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.535760 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.539023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624692 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.624771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 
09:24:51.732791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.732933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.733884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc 
kubenswrapper[4792]: E0301 09:24:51.734055 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.734146 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.734127888 +0000 UTC m=+1028.976007085 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "metrics-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.735422 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.735563 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: E0301 09:24:51.735628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs podName:d1d3783f-78e9-461a-916a-5a46e3083e70 nodeName:}" failed. No retries permitted until 2026-03-01 09:24:59.735585834 +0000 UTC m=+1028.977465031 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs") pod "openstack-operator-controller-manager-864b865b94-5ndlx" (UID: "d1d3783f-78e9-461a-916a-5a46e3083e70") : secret "webhook-server-cert" not found Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.779095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"community-operators-76rvg\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.854636 4792 generic.go:334] "Generic (PLEG): container finished" podID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" exitCode=0 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.854691 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.861768 4792 generic.go:334] "Generic (PLEG): container finished" podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526" exitCode=0 Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.862831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"} Mar 01 09:24:51 crc kubenswrapper[4792]: I0301 09:24:51.872528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:24:52 crc kubenswrapper[4792]: I0301 09:24:52.946678 4792 scope.go:117] "RemoveContainer" containerID="ad33205b5c6776c36f5f90bc6d51a56bb6cf073bf39f5d634c13c03da022cc95" Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.805704 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.806726 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.807110 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:24:57 crc kubenswrapper[4792]: E0301 09:24:57.807144 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-29ht5" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.735315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.743105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ea6739c2-185a-43e7-8fcf-0b2ae31957a0-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-dsqtf\" (UID: \"ea6739c2-185a-43e7-8fcf-0b2ae31957a0\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.760368 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vn8ng" Mar 01 09:24:58 crc kubenswrapper[4792]: I0301 09:24:58.769334 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.242429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.246458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9244686e-175e-45f9-9eb7-23621cd1f3cd-cert\") pod \"openstack-baremetal-operator-controller-manager-7b4cc4776948grv\" (UID: \"9244686e-175e-45f9-9eb7-23621cd1f3cd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.501828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6dst" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.510986 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.749514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.749608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.757187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-metrics-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.758755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d1d3783f-78e9-461a-916a-5a46e3083e70-webhook-certs\") pod \"openstack-operator-controller-manager-864b865b94-5ndlx\" (UID: \"d1d3783f-78e9-461a-916a-5a46e3083e70\") " pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.946411 4792 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-chzwj" Mar 01 09:24:59 crc kubenswrapper[4792]: I0301 09:24:59.954750 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.113275 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.113860 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8l9fq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-jpxwz_openstack-operators(4fe8270e-a46d-40bc-8d24-a4585b196f5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.115012 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podUID="4fe8270e-a46d-40bc-8d24-a4585b196f5e" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.899273 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.899525 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc6l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-54rpl_openstack-operators(ecc17c18-7695-4d22-9a95-bcac51800d60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.900712 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podUID="ecc17c18-7695-4d22-9a95-bcac51800d60" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.944244 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podUID="ecc17c18-7695-4d22-9a95-bcac51800d60" Mar 01 09:25:03 crc kubenswrapper[4792]: E0301 09:25:03.944333 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podUID="4fe8270e-a46d-40bc-8d24-a4585b196f5e" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.278744 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.278950 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dmrmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-bcnns_openstack-operators(2970c60c-7b03-4667-99e4-08c094cdbfc2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.280126 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podUID="2970c60c-7b03-4667-99e4-08c094cdbfc2" Mar 01 09:25:05 crc kubenswrapper[4792]: E0301 09:25:05.952923 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podUID="2970c60c-7b03-4667-99e4-08c094cdbfc2" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.946498 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.946658 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74rgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-mqndr_openstack-operators(e0cef8e2-a392-4612-97c6-17c611b2a44e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.947829 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podUID="e0cef8e2-a392-4612-97c6-17c611b2a44e" Mar 01 09:25:06 crc kubenswrapper[4792]: E0301 09:25:06.958639 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podUID="e0cef8e2-a392-4612-97c6-17c611b2a44e" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.574401 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.574607 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhplh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-hlzm6_openstack-operators(1793465e-1273-4250-a238-c99798788618): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.575765 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podUID="1793465e-1273-4250-a238-c99798788618" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.805587 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.822755 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.823333 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" cmd=["grpc_health_probe","-addr=:50051"] Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.823401 4792 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-29ht5" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:25:07 crc kubenswrapper[4792]: E0301 09:25:07.976275 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podUID="1793465e-1273-4250-a238-c99798788618" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.364952 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.365375 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqf5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-9wzbh_openstack-operators(02dd5cc0-c44b-4ede-972b-9d26c9c54100): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.366704 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podUID="02dd5cc0-c44b-4ede-972b-9d26c9c54100" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.902996 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.903167 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nj6r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-ggspg_openstack-operators(b9e3fd6b-e3e2-4380-b8d7-900891df562a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.904559 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podUID="b9e3fd6b-e3e2-4380-b8d7-900891df562a" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.993793 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podUID="b9e3fd6b-e3e2-4380-b8d7-900891df562a" Mar 01 09:25:10 crc kubenswrapper[4792]: E0301 09:25:10.993793 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podUID="02dd5cc0-c44b-4ede-972b-9d26c9c54100" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.817842 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.818996 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9d52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-t5fsn_openstack-operators(376afe52-646d-44b7-b32e-ce6cd6dc21a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:12 crc kubenswrapper[4792]: E0301 09:25:12.821611 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podUID="376afe52-646d-44b7-b32e-ce6cd6dc21a6" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.007189 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podUID="376afe52-646d-44b7-b32e-ce6cd6dc21a6" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.403666 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.405432 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxh8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-5d87c9d997-72srw_openstack-operators(bf1f37ea-a566-4dfd-b45b-02f284f19ce3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.407201 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podUID="bf1f37ea-a566-4dfd-b45b-02f284f19ce3" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.972480 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.973074 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8khs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-78bc7f9bd9-55qzx_openstack-operators(cd83ed19-023d-43c2-92db-d290499db3d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:13 crc kubenswrapper[4792]: E0301 09:25:13.974217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podUID="cd83ed19-023d-43c2-92db-d290499db3d4" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.017630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29ht5" event={"ID":"15fa5cd2-57f2-4589-9947-c4a227fa68b6","Type":"ContainerDied","Data":"5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2"} Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.017666 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee4a8e4800ff27037b52aa58b081311f95a6ba4d258c46fcee562038196b6f2" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 
09:25:14.019350 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podUID="bf1f37ea-a566-4dfd-b45b-02f284f19ce3" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.019353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podUID="cd83ed19-023d-43c2-92db-d290499db3d4" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.066294 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.152317 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") pod \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\" (UID: \"15fa5cd2-57f2-4589-9947-c4a227fa68b6\") " Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.153883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities" (OuterVolumeSpecName: "utilities") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.158190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt" (OuterVolumeSpecName: "kube-api-access-dtpbt") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "kube-api-access-dtpbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.183328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15fa5cd2-57f2-4589-9947-c4a227fa68b6" (UID: "15fa5cd2-57f2-4589-9947-c4a227fa68b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254384 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254415 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtpbt\" (UniqueName: \"kubernetes.io/projected/15fa5cd2-57f2-4589-9947-c4a227fa68b6-kube-api-access-dtpbt\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: I0301 09:25:14.254426 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fa5cd2-57f2-4589-9947-c4a227fa68b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.659202 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.659381 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pcdk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-knk7m_openstack-operators(8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:14 crc kubenswrapper[4792]: E0301 09:25:14.660699 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podUID="8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.028401 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29ht5" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.029164 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podUID="8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.070357 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.078686 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-29ht5"] Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.262422 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.262581 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jwgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-64lkf_openstack-operators(e45ebab9-87d5-4b2f-b3d1-f1832864584d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.264488 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:25:15 crc kubenswrapper[4792]: I0301 09:25:15.417259 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" path="/var/lib/kubelet/pods/15fa5cd2-57f2-4589-9947-c4a227fa68b6/volumes" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.748531 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.748700 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sxwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-wjf62_openstack-operators(234d2ae5-7589-44cc-83f4-b0ee8a91940a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:15 crc kubenswrapper[4792]: E0301 09:25:15.749884 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podUID="234d2ae5-7589-44cc-83f4-b0ee8a91940a" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.036510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podUID="234d2ae5-7589-44cc-83f4-b0ee8a91940a" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.289274 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.289447 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fct2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-jdn6k_openstack-operators(808b8753-0a20-419b-8b04-dcbccaa2d77e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.290623 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.725765 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.726054 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nsppl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-r5l9m_openstack-operators(1ecd6b07-eda9-41d6-90af-6471699ff808): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:25:16 crc kubenswrapper[4792]: E0301 09:25:16.728887 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podUID="1ecd6b07-eda9-41d6-90af-6471699ff808" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.045810 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv"] Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.048684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" 
event={"ID":"dfb10d33-c4f1-4287-be83-dff835c733ba","Type":"ContainerStarted","Data":"7f3a04247b646b10567c07c9e3e71548969040e3e9071182a3e180e0de1ed7d2"} Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.058880 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9244686e_175e_45f9_9eb7_23621cd1f3cd.slice/crio-3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b WatchSource:0}: Error finding container 3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b: Status 404 returned error can't find the container with id 3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.062593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:25:17 crc kubenswrapper[4792]: E0301 09:25:17.062752 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podUID="1ecd6b07-eda9-41d6-90af-6471699ff808" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.090590 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" podStartSLOduration=7.142139149 podStartE2EDuration="35.090568357s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.733623252 +0000 UTC m=+1015.975502449" lastFinishedPulling="2026-03-01 09:25:14.68205242 +0000 UTC m=+1043.923931657" observedRunningTime="2026-03-01 09:25:17.084252647 +0000 UTC m=+1046.326131844" watchObservedRunningTime="2026-03-01 
09:25:17.090568357 +0000 UTC m=+1046.332447554" Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.335862 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf"] Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.339408 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6739c2_185a_43e7_8fcf_0b2ae31957a0.slice/crio-f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97 WatchSource:0}: Error finding container f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97: Status 404 returned error can't find the container with id f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97 Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.340861 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:17 crc kubenswrapper[4792]: I0301 09:25:17.369588 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx"] Mar 01 09:25:17 crc kubenswrapper[4792]: W0301 09:25:17.387196 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d3783f_78e9_461a_916a_5a46e3083e70.slice/crio-4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28 WatchSource:0}: Error finding container 4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28: Status 404 returned error can't find the container with id 4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.055188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" 
event={"ID":"5044cf86-f557-41d4-b6c0-a41a668ac999","Type":"ContainerStarted","Data":"ad7ac441d6a6d297f23e4f60f71aaf15be42b6ef9a3f77c6078baffb9583af3d"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.056529 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.058296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" event={"ID":"ea6739c2-185a-43e7-8fcf-0b2ae31957a0","Type":"ContainerStarted","Data":"f96a68fcaf233f289334a7086cece4460f0c737f53e6d3344ee4df9e6fddae97"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.059113 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" event={"ID":"9244686e-175e-45f9-9eb7-23621cd1f3cd","Type":"ContainerStarted","Data":"3f6926cb229a8c54312c01ad1c5c76705cfa50a019a881ae8c04e230a3bdbb3b"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.064151 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" event={"ID":"3d38195c-e4ff-49cf-9592-e9f52d73f2df","Type":"ContainerStarted","Data":"d9878c65a4fd17c28fde0a37cca1809fdeaa343383582c866c265cc65d15bcfa"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.064740 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.065768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" event={"ID":"d1d3783f-78e9-461a-916a-5a46e3083e70","Type":"ContainerStarted","Data":"4787f8f2d253ff4dd1d652823910ffe40b98fc88f9619b345142107028aa83f6"} Mar 01 09:25:18 crc 
kubenswrapper[4792]: I0301 09:25:18.065795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" event={"ID":"d1d3783f-78e9-461a-916a-5a46e3083e70","Type":"ContainerStarted","Data":"4015b7338d1f84c58b14a44bada706d5334972f3a577cbe5b4d43f140d3fbc28"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.066278 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.067358 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" event={"ID":"8741a141-0194-4eb2-956e-c41f4ffe1338","Type":"ContainerStarted","Data":"ea5bb485455205d13e998f2d92ddff8adb9c1ab3676579be65add75470498c08"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.067811 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069058 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" exitCode=0 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069110 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.069129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" 
event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"6f39b288ec76c37d95cc09cbb8367cea58bf1050a4c16d38edfc803ed8d4b5b8"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.073428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" event={"ID":"2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5","Type":"ContainerStarted","Data":"d22ef362e5f7c3b60d8a91d8c51b4543bfe8b25dee50b04edef1cacaa3f86986"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.073515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.075374 4792 generic.go:334] "Generic (PLEG): container finished" podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277" exitCode=0 Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.075413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"} Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.115404 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" podStartSLOduration=7.508994189 podStartE2EDuration="35.11538325s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.054362381 +0000 UTC m=+1016.296241578" lastFinishedPulling="2026-03-01 09:25:14.660751442 +0000 UTC m=+1043.902630639" observedRunningTime="2026-03-01 09:25:18.115093393 +0000 UTC m=+1047.356972590" watchObservedRunningTime="2026-03-01 09:25:18.11538325 +0000 UTC m=+1047.357262457" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 
09:25:18.119345 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" podStartSLOduration=7.643419735 podStartE2EDuration="36.119331184s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.206177622 +0000 UTC m=+1015.448056819" lastFinishedPulling="2026-03-01 09:25:14.682089071 +0000 UTC m=+1043.923968268" observedRunningTime="2026-03-01 09:25:18.089724138 +0000 UTC m=+1047.331603345" watchObservedRunningTime="2026-03-01 09:25:18.119331184 +0000 UTC m=+1047.361210381" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.138510 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" podStartSLOduration=6.049465239 podStartE2EDuration="36.13848575s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.57169961 +0000 UTC m=+1013.813578817" lastFinishedPulling="2026-03-01 09:25:14.660720101 +0000 UTC m=+1043.902599328" observedRunningTime="2026-03-01 09:25:18.133718667 +0000 UTC m=+1047.375597864" watchObservedRunningTime="2026-03-01 09:25:18.13848575 +0000 UTC m=+1047.380364947" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.209346 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" podStartSLOduration=35.209321838 podStartE2EDuration="35.209321838s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:25:18.199783421 +0000 UTC m=+1047.441662618" watchObservedRunningTime="2026-03-01 09:25:18.209321838 +0000 UTC m=+1047.451201045" Mar 01 09:25:18 crc kubenswrapper[4792]: I0301 09:25:18.236201 4792 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" podStartSLOduration=6.573772833 podStartE2EDuration="36.236184638s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.61697233 +0000 UTC m=+1015.858851527" lastFinishedPulling="2026-03-01 09:25:16.279384135 +0000 UTC m=+1045.521263332" observedRunningTime="2026-03-01 09:25:18.22868142 +0000 UTC m=+1047.470560617" watchObservedRunningTime="2026-03-01 09:25:18.236184638 +0000 UTC m=+1047.478063835" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.119978 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" event={"ID":"4fe8270e-a46d-40bc-8d24-a4585b196f5e","Type":"ContainerStarted","Data":"6892640fbb705e28b0c71c2706ad76c2d091ae6f2dd03c538b97f78bac1cc741"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.125521 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.134252 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" event={"ID":"e0cef8e2-a392-4612-97c6-17c611b2a44e","Type":"ContainerStarted","Data":"3b2391a3b24c4743bb3f6633537111671e24e6bbc2f99b760acdcabb3d972ce9"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.137234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" event={"ID":"2970c60c-7b03-4667-99e4-08c094cdbfc2","Type":"ContainerStarted","Data":"aa90c8aa701a9cf03433d8e5e5defb7d1475d444068902178946a74477ab2a0f"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.137578 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.143651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerStarted","Data":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.146089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" event={"ID":"ecc17c18-7695-4d22-9a95-bcac51800d60","Type":"ContainerStarted","Data":"0f09ae0e09ec4ee1981076cf0062bdb867f23798fa0a4b442b4fe351e4f18779"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.146360 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.149470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.167656 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" podStartSLOduration=4.970947146 podStartE2EDuration="39.167634479s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.168784099 +0000 UTC m=+1016.410663296" lastFinishedPulling="2026-03-01 09:25:21.365471432 +0000 UTC m=+1050.607350629" observedRunningTime="2026-03-01 09:25:22.145530892 +0000 UTC m=+1051.387410099" watchObservedRunningTime="2026-03-01 09:25:22.167634479 +0000 UTC m=+1051.409513676" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 
09:25:22.180969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" event={"ID":"ea6739c2-185a-43e7-8fcf-0b2ae31957a0","Type":"ContainerStarted","Data":"f9736d0776a1969518aa8cf69ea66d57fc40ece1f9479a0b2732c5251133a5e1"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.181593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.203273 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgfjs" podStartSLOduration=6.772570505 podStartE2EDuration="34.203253907s" podCreationTimestamp="2026-03-01 09:24:48 +0000 UTC" firstStartedPulling="2026-03-01 09:24:53.237570305 +0000 UTC m=+1022.479449512" lastFinishedPulling="2026-03-01 09:25:20.668253717 +0000 UTC m=+1049.910132914" observedRunningTime="2026-03-01 09:25:22.182460041 +0000 UTC m=+1051.424339238" watchObservedRunningTime="2026-03-01 09:25:22.203253907 +0000 UTC m=+1051.445133104" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.208270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" event={"ID":"9244686e-175e-45f9-9eb7-23621cd1f3cd","Type":"ContainerStarted","Data":"f62adb87846062dc897503d4685420a6831398730f723a3a8bffe44bda9273dc"} Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.209105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.210672 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" podStartSLOduration=6.591924972 
podStartE2EDuration="39.210660354s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.121919149 +0000 UTC m=+1016.363798346" lastFinishedPulling="2026-03-01 09:25:19.740654511 +0000 UTC m=+1048.982533728" observedRunningTime="2026-03-01 09:25:22.208967573 +0000 UTC m=+1051.450846770" watchObservedRunningTime="2026-03-01 09:25:22.210660354 +0000 UTC m=+1051.452539551" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.244863 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" podStartSLOduration=34.954972636 podStartE2EDuration="39.244842418s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:25:17.0768495 +0000 UTC m=+1046.318728697" lastFinishedPulling="2026-03-01 09:25:21.366719282 +0000 UTC m=+1050.608598479" observedRunningTime="2026-03-01 09:25:22.238083357 +0000 UTC m=+1051.479962574" watchObservedRunningTime="2026-03-01 09:25:22.244842418 +0000 UTC m=+1051.486721615" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.289866 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" podStartSLOduration=36.267530955 podStartE2EDuration="40.289848611s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:25:17.342090811 +0000 UTC m=+1046.583969998" lastFinishedPulling="2026-03-01 09:25:21.364408457 +0000 UTC m=+1050.606287654" observedRunningTime="2026-03-01 09:25:22.28479562 +0000 UTC m=+1051.526674817" watchObservedRunningTime="2026-03-01 09:25:22.289848611 +0000 UTC m=+1051.531727808" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.311547 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" podStartSLOduration=6.51000262 
podStartE2EDuration="39.311528747s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.940123778 +0000 UTC m=+1016.182002975" lastFinishedPulling="2026-03-01 09:25:19.741649895 +0000 UTC m=+1048.983529102" observedRunningTime="2026-03-01 09:25:22.306897067 +0000 UTC m=+1051.548776264" watchObservedRunningTime="2026-03-01 09:25:22.311528747 +0000 UTC m=+1051.553407944" Mar 01 09:25:22 crc kubenswrapper[4792]: I0301 09:25:22.898247 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jlnsb" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.073095 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7v65r" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.218202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" event={"ID":"1793465e-1273-4250-a238-c99798788618","Type":"ContainerStarted","Data":"1babc5a330340ac6903322191b57ac6f28537095bd94a297360a9b3ffade03eb"} Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.218516 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-jvw5j" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.219153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.220976 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" exitCode=0 Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.221346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"} Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.222783 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.238469 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" podStartSLOduration=6.208555556 podStartE2EDuration="41.238448387s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.826102031 +0000 UTC m=+1016.067981228" lastFinishedPulling="2026-03-01 09:25:21.855994862 +0000 UTC m=+1051.097874059" observedRunningTime="2026-03-01 09:25:23.237226028 +0000 UTC m=+1052.479105225" watchObservedRunningTime="2026-03-01 09:25:23.238448387 +0000 UTC m=+1052.480327584" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.273638 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" podStartSLOduration=5.542889423 podStartE2EDuration="40.273615835s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.124212535 +0000 UTC m=+1016.366091732" lastFinishedPulling="2026-03-01 09:25:21.854938947 +0000 UTC m=+1051.096818144" observedRunningTime="2026-03-01 09:25:23.267606502 +0000 UTC m=+1052.509485699" watchObservedRunningTime="2026-03-01 09:25:23.273615835 +0000 UTC m=+1052.515495032" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 09:25:23.362382 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qjqd2" Mar 01 09:25:23 crc kubenswrapper[4792]: I0301 
09:25:23.777352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-zkx7c" Mar 01 09:25:27 crc kubenswrapper[4792]: E0301 09:25:27.557023 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podUID="808b8753-0a20-419b-8b04-dcbccaa2d77e" Mar 01 09:25:27 crc kubenswrapper[4792]: E0301 09:25:27.571100 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podUID="e45ebab9-87d5-4b2f-b3d1-f1832864584d" Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.253779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerStarted","Data":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"} Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.255617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" event={"ID":"02dd5cc0-c44b-4ede-972b-9d26c9c54100","Type":"ContainerStarted","Data":"1f0fdd6f611b2cf63cb513a2021cdbfb6a0e69f83021884f9e2a5716c7dfdb7d"} Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.255976 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" 
Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.271959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-76rvg" podStartSLOduration=27.769899138 podStartE2EDuration="37.271940911s" podCreationTimestamp="2026-03-01 09:24:51 +0000 UTC" firstStartedPulling="2026-03-01 09:25:18.070166792 +0000 UTC m=+1047.312045989" lastFinishedPulling="2026-03-01 09:25:27.572208565 +0000 UTC m=+1056.814087762" observedRunningTime="2026-03-01 09:25:28.270319342 +0000 UTC m=+1057.512198529" watchObservedRunningTime="2026-03-01 09:25:28.271940911 +0000 UTC m=+1057.513820108" Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.307269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" podStartSLOduration=5.340439285 podStartE2EDuration="46.307240502s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.651014585 +0000 UTC m=+1015.892893782" lastFinishedPulling="2026-03-01 09:25:27.617815802 +0000 UTC m=+1056.859694999" observedRunningTime="2026-03-01 09:25:28.299683392 +0000 UTC m=+1057.541562609" watchObservedRunningTime="2026-03-01 09:25:28.307240502 +0000 UTC m=+1057.549119699" Mar 01 09:25:28 crc kubenswrapper[4792]: I0301 09:25:28.774554 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-dsqtf" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.206164 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.206233 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.277611 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" event={"ID":"bf1f37ea-a566-4dfd-b45b-02f284f19ce3","Type":"ContainerStarted","Data":"560bd85111d581b27b39be06d4086bec168b864f989e9133615e7a97f6221eb9"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.278878 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.282857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" event={"ID":"b9e3fd6b-e3e2-4380-b8d7-900891df562a","Type":"ContainerStarted","Data":"45614005e3c7ecd0bab6ebc1c2f744a57f9d4ee3751d0a1f01582d0ad0b5f387"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.283158 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.287419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" event={"ID":"1ecd6b07-eda9-41d6-90af-6471699ff808","Type":"ContainerStarted","Data":"80541088e8c24ff4ba3c18c56556e60eda017df94f18130d34db13224329f773"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.295213 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" event={"ID":"8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9","Type":"ContainerStarted","Data":"8737006219f5372a30ad61237d82618383e87d9283f6ad5f49b2991c2224b97f"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.295741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.297553 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" event={"ID":"234d2ae5-7589-44cc-83f4-b0ee8a91940a","Type":"ContainerStarted","Data":"172e116d6c1904833ece964abe39b26379c590a14618bb39e7bb55e8277d5f6a"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.297939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.299484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" event={"ID":"376afe52-646d-44b7-b32e-ce6cd6dc21a6","Type":"ContainerStarted","Data":"5e3956812ed98050d4a71d8b39110c43a71dfa78c7f35697b9c16752b4a6a549"} Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.299810 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.343810 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5l9m" podStartSLOduration=4.600301953 podStartE2EDuration="46.343794263s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.242869266 +0000 UTC m=+1016.484748463" lastFinishedPulling="2026-03-01 09:25:28.986361576 +0000 UTC m=+1058.228240773" observedRunningTime="2026-03-01 09:25:29.338535838 +0000 UTC m=+1058.580415035" watchObservedRunningTime="2026-03-01 09:25:29.343794263 +0000 UTC m=+1058.585673460" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.344212 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" podStartSLOduration=3.360604743 podStartE2EDuration="47.344206783s" 
podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.02145788 +0000 UTC m=+1013.263337077" lastFinishedPulling="2026-03-01 09:25:28.0050599 +0000 UTC m=+1057.246939117" observedRunningTime="2026-03-01 09:25:29.32136155 +0000 UTC m=+1058.563240747" watchObservedRunningTime="2026-03-01 09:25:29.344206783 +0000 UTC m=+1058.586085980" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.372922 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" podStartSLOduration=6.206345242 podStartE2EDuration="47.372889477s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.795863279 +0000 UTC m=+1016.037742476" lastFinishedPulling="2026-03-01 09:25:27.962407494 +0000 UTC m=+1057.204286711" observedRunningTime="2026-03-01 09:25:29.365752197 +0000 UTC m=+1058.607631394" watchObservedRunningTime="2026-03-01 09:25:29.372889477 +0000 UTC m=+1058.614768674" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.394105 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" podStartSLOduration=5.220785248 podStartE2EDuration="47.394088622s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.686556627 +0000 UTC m=+1015.928435824" lastFinishedPulling="2026-03-01 09:25:28.859860001 +0000 UTC m=+1058.101739198" observedRunningTime="2026-03-01 09:25:29.393630641 +0000 UTC m=+1058.635509838" watchObservedRunningTime="2026-03-01 09:25:29.394088622 +0000 UTC m=+1058.635967819" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.424201 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" podStartSLOduration=5.111613239 podStartE2EDuration="47.424181629s" 
podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.676266255 +0000 UTC m=+1015.918145452" lastFinishedPulling="2026-03-01 09:25:28.988834645 +0000 UTC m=+1058.230713842" observedRunningTime="2026-03-01 09:25:29.416773612 +0000 UTC m=+1058.658652809" watchObservedRunningTime="2026-03-01 09:25:29.424181629 +0000 UTC m=+1058.666060826" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.435116 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" podStartSLOduration=4.058551011 podStartE2EDuration="47.435098089s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:44.584426002 +0000 UTC m=+1013.826305199" lastFinishedPulling="2026-03-01 09:25:27.96097308 +0000 UTC m=+1057.202852277" observedRunningTime="2026-03-01 09:25:29.429581558 +0000 UTC m=+1058.671460755" watchObservedRunningTime="2026-03-01 09:25:29.435098089 +0000 UTC m=+1058.676977286" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.528345 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b4cc4776948grv" Mar 01 09:25:29 crc kubenswrapper[4792]: I0301 09:25:29.965576 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-864b865b94-5ndlx" Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.314392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" event={"ID":"cd83ed19-023d-43c2-92db-d290499db3d4","Type":"ContainerStarted","Data":"23bd907d7303eab6c38e1e32812fcbbfb88d20a5a6e1bf2c2d7e1863cd0a2ccb"} Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.339446 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" podStartSLOduration=4.776126929 podStartE2EDuration="48.3394304s" podCreationTimestamp="2026-03-01 09:24:42 +0000 UTC" firstStartedPulling="2026-03-01 09:24:46.241871048 +0000 UTC m=+1015.483750245" lastFinishedPulling="2026-03-01 09:25:29.805174519 +0000 UTC m=+1059.047053716" observedRunningTime="2026-03-01 09:25:30.334821941 +0000 UTC m=+1059.576701138" watchObservedRunningTime="2026-03-01 09:25:30.3394304 +0000 UTC m=+1059.581309597" Mar 01 09:25:30 crc kubenswrapper[4792]: I0301 09:25:30.413399 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vgfjs" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" probeResult="failure" output=< Mar 01 09:25:30 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:25:30 crc kubenswrapper[4792]: > Mar 01 09:25:31 crc kubenswrapper[4792]: I0301 09:25:31.872578 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:31 crc kubenswrapper[4792]: I0301 09:25:31.874244 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:32 crc kubenswrapper[4792]: I0301 09:25:32.914351 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-76rvg" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" probeResult="failure" output=< Mar 01 09:25:32 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:25:32 crc kubenswrapper[4792]: > Mar 01 09:25:32 crc kubenswrapper[4792]: I0301 09:25:32.969705 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-9wzbh" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 
09:25:33.139536 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.217776 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-wjf62" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.312668 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-hlzm6" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.653088 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-54rpl" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.953392 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-jpxwz" Mar 01 09:25:33 crc kubenswrapper[4792]: I0301 09:25:33.970975 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-mqndr" Mar 01 09:25:34 crc kubenswrapper[4792]: I0301 09:25:34.229441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-bcnns" Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.259600 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.304092 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:39 crc kubenswrapper[4792]: I0301 09:25:39.498605 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.372537 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgfjs" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" containerID="cri-o://888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" gracePeriod=2 Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.874364 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.984283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") pod \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\" (UID: \"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a\") " Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.985146 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities" (OuterVolumeSpecName: "utilities") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID: 
"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:40 crc kubenswrapper[4792]: I0301 09:25:40.995241 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl" (OuterVolumeSpecName: "kube-api-access-xztgl") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID: "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "kube-api-access-xztgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.042695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" (UID: "02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085183 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xztgl\" (UniqueName: \"kubernetes.io/projected/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-kube-api-access-xztgl\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085370 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.085457 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383239 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" exitCode=0 Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"} Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgfjs" event={"ID":"02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a","Type":"ContainerDied","Data":"07b2ccb12f444a169347668634bc26575de0ebccb8ee9dc035b529cef91259bc"} Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.383341 4792 scope.go:117] "RemoveContainer" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.384709 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgfjs" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.425628 4792 scope.go:117] "RemoveContainer" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.426885 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.439560 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgfjs"] Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.454492 4792 scope.go:117] "RemoveContainer" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493432 4792 scope.go:117] "RemoveContainer" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.493822 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": container with ID starting with 888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10 not found: ID does not exist" containerID="888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493855 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10"} err="failed to get container status \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": rpc error: code = NotFound desc = could not find container \"888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10\": container with ID starting with 888c652da8c6af91af7d18e82384bcef82d7b06f24eb6e5b62926e1135a94b10 not 
found: ID does not exist" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.493880 4792 scope.go:117] "RemoveContainer" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277" Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.494181 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": container with ID starting with 7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277 not found: ID does not exist" containerID="7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494208 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277"} err="failed to get container status \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": rpc error: code = NotFound desc = could not find container \"7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277\": container with ID starting with 7d843e7f8ed1891003a854d8c5e8a92905cc32dc6fd52496f440d3191a085277 not found: ID does not exist" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494225 4792 scope.go:117] "RemoveContainer" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526" Mar 01 09:25:41 crc kubenswrapper[4792]: E0301 09:25:41.494814 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": container with ID starting with eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526 not found: ID does not exist" containerID="eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.494837 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526"} err="failed to get container status \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": rpc error: code = NotFound desc = could not find container \"eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526\": container with ID starting with eab704b2ab894fb8cdec51e904b0f50429dd2cdba1e21487a963aeddb64f6526 not found: ID does not exist" Mar 01 09:25:41 crc kubenswrapper[4792]: I0301 09:25:41.957540 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.023511 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.390112 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" event={"ID":"808b8753-0a20-419b-8b04-dcbccaa2d77e","Type":"ContainerStarted","Data":"3256fa7dea275757daf6c0806ce9bfa49ca3f09dc326c3af4e322bc3a564d5fb"} Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.391074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.410574 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" podStartSLOduration=4.784279243 podStartE2EDuration="59.410557608s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.27642797 +0000 UTC m=+1016.518307167" lastFinishedPulling="2026-03-01 09:25:41.902706335 +0000 UTC m=+1071.144585532" observedRunningTime="2026-03-01 
09:25:42.405969639 +0000 UTC m=+1071.647848836" watchObservedRunningTime="2026-03-01 09:25:42.410557608 +0000 UTC m=+1071.652436815" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.880846 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-ggspg" Mar 01 09:25:42 crc kubenswrapper[4792]: I0301 09:25:42.952638 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-72srw" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.142657 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-55qzx" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.346456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-t5fsn" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.399398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" event={"ID":"e45ebab9-87d5-4b2f-b3d1-f1832864584d","Type":"ContainerStarted","Data":"fc2c4132f6b6f61506c1aef72d0aaac00e147da7528cfb7453b927b244ba87d0"} Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.399932 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.420031 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" path="/var/lib/kubelet/pods/02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a/volumes" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.421356 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" podStartSLOduration=4.883327199 podStartE2EDuration="1m0.421339606s" podCreationTimestamp="2026-03-01 09:24:43 +0000 UTC" firstStartedPulling="2026-03-01 09:24:47.257359652 +0000 UTC m=+1016.499238849" lastFinishedPulling="2026-03-01 09:25:42.795372059 +0000 UTC m=+1072.037251256" observedRunningTime="2026-03-01 09:25:43.418465898 +0000 UTC m=+1072.660345095" watchObservedRunningTime="2026-03-01 09:25:43.421339606 +0000 UTC m=+1072.663218803" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.558587 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-knk7m" Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.896888 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:43 crc kubenswrapper[4792]: I0301 09:25:43.897228 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-76rvg" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" containerID="cri-o://0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" gracePeriod=2 Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.341534 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.407114 4792 generic.go:334] "Generic (PLEG): container finished" podID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" exitCode=0 Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.407767 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-76rvg" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"} Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408214 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-76rvg" event={"ID":"5ee54ba4-1afe-492b-a35b-23f0da447772","Type":"ContainerDied","Data":"6f39b288ec76c37d95cc09cbb8367cea58bf1050a4c16d38edfc803ed8d4b5b8"} Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.408231 4792 scope.go:117] "RemoveContainer" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.426209 4792 scope.go:117] "RemoveContainer" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452160 4792 scope.go:117] "RemoveContainer" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.452956 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: 
I0301 09:25:44.452991 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") pod \"5ee54ba4-1afe-492b-a35b-23f0da447772\" (UID: \"5ee54ba4-1afe-492b-a35b-23f0da447772\") " Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.462041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7" (OuterVolumeSpecName: "kube-api-access-4c9l7") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "kube-api-access-4c9l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.463001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities" (OuterVolumeSpecName: "utilities") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.476219 4792 scope.go:117] "RemoveContainer" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.477244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": container with ID starting with 0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453 not found: ID does not exist" containerID="0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.477273 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453"} err="failed to get container status \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": rpc error: code = NotFound desc = could not find container \"0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453\": container with ID starting with 0be24cd4ccbbe75283901f8e9a66a85ecb2a776492e33a934b9d2c63f6f16453 not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.477307 4792 scope.go:117] "RemoveContainer" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.477978 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": container with ID starting with d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a not found: ID does not exist" containerID="d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478028 
4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a"} err="failed to get container status \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": rpc error: code = NotFound desc = could not find container \"d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a\": container with ID starting with d56625efa0b27d9e3673d2b751ede19c1c4b198dcd6670a8ed75741872c3784a not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478055 4792 scope.go:117] "RemoveContainer" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: E0301 09:25:44.478555 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": container with ID starting with 832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243 not found: ID does not exist" containerID="832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.478583 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243"} err="failed to get container status \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": rpc error: code = NotFound desc = could not find container \"832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243\": container with ID starting with 832abdde1bd9df0dee6fd890cd41053134638610acc4bc9948b5dd0abc79f243 not found: ID does not exist" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.504670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "5ee54ba4-1afe-492b-a35b-23f0da447772" (UID: "5ee54ba4-1afe-492b-a35b-23f0da447772"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.554954 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.555006 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ee54ba4-1afe-492b-a35b-23f0da447772-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.555021 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9l7\" (UniqueName: \"kubernetes.io/projected/5ee54ba4-1afe-492b-a35b-23f0da447772-kube-api-access-4c9l7\") on node \"crc\" DevicePath \"\"" Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.735186 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:44 crc kubenswrapper[4792]: I0301 09:25:44.742731 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-76rvg"] Mar 01 09:25:45 crc kubenswrapper[4792]: I0301 09:25:45.419702 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" path="/var/lib/kubelet/pods/5ee54ba4-1afe-492b-a35b-23f0da447772/volumes" Mar 01 09:25:53 crc kubenswrapper[4792]: I0301 09:25:53.984760 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-jdn6k" Mar 01 09:25:53 crc kubenswrapper[4792]: I0301 09:25:53.996251 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-64lkf" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.142862 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148699 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148744 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148763 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148773 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148784 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148852 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148861 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="extract-utilities" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148875 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148919 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148933 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.148941 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.148956 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149021 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: E0301 09:26:00.149039 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149047 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="extract-content" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149297 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15fa5cd2-57f2-4589-9947-c4a227fa68b6" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149337 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee54ba4-1afe-492b-a35b-23f0da447772" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.149353 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a5b2b3-e9cc-404a-a9e7-8c7a93d5ca5a" containerName="registry-server" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.150091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.150272 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153134 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153511 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.153654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.257562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.359156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8rtl\" 
(UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.382612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"auto-csr-approver-29539286-l47hq\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.467592 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:00 crc kubenswrapper[4792]: I0301 09:26:00.924399 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:26:00 crc kubenswrapper[4792]: W0301 09:26:00.934741 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eeb77af_03ae_4e32_80a6_3c16ed5ef64e.slice/crio-c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad WatchSource:0}: Error finding container c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad: Status 404 returned error can't find the container with id c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad Mar 01 09:26:01 crc kubenswrapper[4792]: I0301 09:26:01.520068 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerStarted","Data":"c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad"} Mar 01 09:26:02 crc kubenswrapper[4792]: I0301 09:26:02.528165 4792 generic.go:334] "Generic (PLEG): 
container finished" podID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerID="30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4" exitCode=0 Mar 01 09:26:02 crc kubenswrapper[4792]: I0301 09:26:02.528222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerDied","Data":"30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4"} Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.786683 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.912435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") pod \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\" (UID: \"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e\") " Mar 01 09:26:03 crc kubenswrapper[4792]: I0301 09:26:03.917523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl" (OuterVolumeSpecName: "kube-api-access-g8rtl") pod "2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" (UID: "2eeb77af-03ae-4e32-80a6-3c16ed5ef64e"). InnerVolumeSpecName "kube-api-access-g8rtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.014305 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8rtl\" (UniqueName: \"kubernetes.io/projected/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e-kube-api-access-g8rtl\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539286-l47hq" event={"ID":"2eeb77af-03ae-4e32-80a6-3c16ed5ef64e","Type":"ContainerDied","Data":"c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad"} Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544410 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19841951e9472daf35bdf79a2795bc24df43866e565d8780899803e8cd6f4ad" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.544809 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539286-l47hq" Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.867762 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.874122 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539280-gz7v9"] Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.943385 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:26:04 crc kubenswrapper[4792]: I0301 09:26:04.943455 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:26:05 crc kubenswrapper[4792]: I0301 09:26:05.416932 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c922d5-9de8-48d8-9f96-ad47d1d4017e" path="/var/lib/kubelet/pods/71c922d5-9de8-48d8-9f96-ad47d1d4017e/volumes" Mar 01 09:26:16 crc kubenswrapper[4792]: I0301 09:26:16.753688 4792 scope.go:117] "RemoveContainer" containerID="752079600b535956d369c891a21eba391b40ef46c0f767fc3b0fcdc6ceb1bddc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.558488 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:18 crc kubenswrapper[4792]: E0301 09:26:18.558990 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559137 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" containerName="oc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.559791 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562593 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.562807 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-glwmn" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.566245 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.569876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.641412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.642552 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.652924 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.662041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.705764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706195 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 
09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.706450 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808124 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.808827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 
09:26:18.809046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.809483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.810362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.811005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.828119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"dnsmasq-dns-589db6c89c-sqrnc\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.830767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"dnsmasq-dns-86bbd886cf-brcdx\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.928165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:18 crc kubenswrapper[4792]: I0301 09:26:18.961685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.386646 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.444545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:19 crc kubenswrapper[4792]: W0301 09:26:19.450261 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf78c5d7_c647_4862_bc7b_e14f8de9ef0f.slice/crio-8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e WatchSource:0}: Error finding container 8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e: Status 404 returned error can't find the container with id 8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.632372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" event={"ID":"40e11436-ae27-48c0-8baf-3f0aecc8e73c","Type":"ContainerStarted","Data":"89744d9bb053371289c3f391707b9cba897bfe50eed733d33f5f66e5f68036ab"} Mar 01 09:26:19 crc kubenswrapper[4792]: I0301 09:26:19.633538 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" 
event={"ID":"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f","Type":"ContainerStarted","Data":"8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e"} Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.354354 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.388770 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.390001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.404212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.442824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.443135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.443248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " 
pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546149 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.546180 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.548396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.550428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.589094 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"dnsmasq-dns-78cb4465c9-czwcl\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.711559 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.715065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.743593 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.745093 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.770799 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859146 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" 
Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.859275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.960365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.961567 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.965090 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:21 crc kubenswrapper[4792]: I0301 09:26:21.987187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"dnsmasq-dns-7c47bcb9f9-xrm9m\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.089344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.140277 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:26:22 crc kubenswrapper[4792]: W0301 09:26:22.150267 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a60915_b57b_4331_ad0e_b671ff576a69.slice/crio-67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809 WatchSource:0}: Error finding container 67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809: Status 404 returned error can't find the container with id 67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809 Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.542368 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.543522 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.548766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.548950 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549064 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549158 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549312 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.549324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-584kl" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.550596 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.564180 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.620922 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:26:22 crc kubenswrapper[4792]: W0301 09:26:22.631748 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ebfee14_6440_476a_87ff_fc933df3eaa8.slice/crio-dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6 WatchSource:0}: Error finding container 
dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6: Status 404 returned error can't find the container with id dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6 Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.671764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" event={"ID":"1ebfee14-6440-476a-87ff-fc933df3eaa8","Type":"ContainerStarted","Data":"dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6"} Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.673452 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" event={"ID":"f8a60915-b57b-4331-ad0e-b671ff576a69","Type":"ContainerStarted","Data":"67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809"} Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 
crc kubenswrapper[4792]: I0301 09:26:22.675439 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 
09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.675834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777433 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777501 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777602 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.777706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.779466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.780165 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.780195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.781067 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.781235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.782003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.793276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.793356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.796412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.805092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.811158 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.825069 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.882556 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.891606 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.895936 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899444 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899656 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5zwb6" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.899708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.900081 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.900310 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.909144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.925672 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984303 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984406 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984449 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" 
Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984638 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:22 crc kubenswrapper[4792]: I0301 09:26:22.984654 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.086278 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088525 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.088878 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089101 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.089783 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.090581 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.090785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.098429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.100756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.106677 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.107564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"rabbitmq-server-0\" (UID: 
\"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.111770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.114970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.230667 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.587254 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.686356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"3e8de91b3c58261b32cbdb52401a16acdc8aa762850b0b7a587dfa85e98e1d6e"} Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.860566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: W0301 09:26:23.861177 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6252a079_917c_46e8_a848_10569e1e057e.slice/crio-dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086 WatchSource:0}: Error finding container dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086: Status 
404 returned error can't find the container with id dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086 Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.995391 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 01 09:26:23 crc kubenswrapper[4792]: I0301 09:26:23.997094 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:23.998531 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cnt9x" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.005949 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006145 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006276 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.006729 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.008879 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120250 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzb9z\" (UniqueName: 
\"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.120550 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.222982 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223077 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzb9z\" (UniqueName: \"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223134 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.223639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0" Mar 01 09:26:24 crc 
kubenswrapper[4792]: I0301 09:26:24.224025 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.224767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-config-data-default\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.229784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-kolla-config\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.231569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b969e6eb-14a7-4e45-8342-ccbd05c06261-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.241141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.248963 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzb9z\" (UniqueName: \"kubernetes.io/projected/b969e6eb-14a7-4e45-8342-ccbd05c06261-kube-api-access-zzb9z\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.254544 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b969e6eb-14a7-4e45-8342-ccbd05c06261-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.277405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"b969e6eb-14a7-4e45-8342-ccbd05c06261\") " pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.332819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.782298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086"}
Mar 01 09:26:24 crc kubenswrapper[4792]: I0301 09:26:24.997018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 01 09:26:25 crc kubenswrapper[4792]: W0301 09:26:25.023168 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb969e6eb_14a7_4e45_8342_ccbd05c06261.slice/crio-afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df WatchSource:0}: Error finding container afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df: Status 404 returned error can't find the container with id afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.375283 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.377992 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.380371 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.381154 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.381199 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7wbzt"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.382614 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.385568 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545430 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545555 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.545677 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.612656 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.613636 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.614423 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617637 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fwwfv"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.617898 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646673 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.646689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.647996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.648778 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.652838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2d03d42-7830-444b-a8ae-c91e16d352b9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.665285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.676148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d03d42-7830-444b-a8ae-c91e16d352b9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.688263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.688951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/f2d03d42-7830-444b-a8ae-c91e16d352b9-kube-api-access-p9gx9\") pod \"openstack-cell1-galera-0\" (UID: \"f2d03d42-7830-444b-a8ae-c91e16d352b9\") " pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.718563 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749158 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749185 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.749304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.799397 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"afc0801880f737b0b1ad27c6152a8ab04e9874208174641d073668979b7169df"}
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850684 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.850776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.851823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-config-data\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.852315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kolla-config\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.865532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.865543 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:25 crc kubenswrapper[4792]: I0301 09:26:25.893874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72rb\" (UniqueName: \"kubernetes.io/projected/84d455ad-7bbb-4771-a8ed-9aa1984e1d40-kube-api-access-j72rb\") pod \"memcached-0\" (UID: \"84d455ad-7bbb-4771-a8ed-9aa1984e1d40\") " pod="openstack/memcached-0"
Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.025095 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.458527 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 01 09:26:26 crc kubenswrapper[4792]: W0301 09:26:26.466543 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d03d42_7830_444b_a8ae_c91e16d352b9.slice/crio-03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81 WatchSource:0}: Error finding container 03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81: Status 404 returned error can't find the container with id 03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81
Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.668806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 01 09:26:26 crc kubenswrapper[4792]: W0301 09:26:26.713577 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d455ad_7bbb_4771_a8ed_9aa1984e1d40.slice/crio-66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2 WatchSource:0}: Error finding container 66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2: Status 404 returned error can't find the container with id 66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2
Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.818990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"03d3c0f01b804fa527d0850dc14fc1a1b576cac0905f5fb14b0077932be52f81"}
Mar 01 09:26:26 crc kubenswrapper[4792]: I0301 09:26:26.825130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84d455ad-7bbb-4771-a8ed-9aa1984e1d40","Type":"ContainerStarted","Data":"66a8dfae8907fc1609feadd743952be29a2aa99fa175799dda640835c77de4d2"}
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.211440 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.212739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.216073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nc787"
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.217456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.311332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0"
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.413739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0"
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.436950 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"kube-state-metrics-0\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " pod="openstack/kube-state-metrics-0"
Mar 01 09:26:28 crc kubenswrapper[4792]: I0301 09:26:28.548777 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 01 09:26:29 crc kubenswrapper[4792]: I0301 09:26:29.089054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 01 09:26:30 crc kubenswrapper[4792]: I0301 09:26:29.865499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerStarted","Data":"d3de3b349ed8682aadffbec7a09f7bd847d16614859016d386affe481743f302"}
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.196604 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mpvqc"]
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.197814 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201592 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s2dfs"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201794 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.201937 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.223656 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc"]
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.235101 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"]
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.236848 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.261968 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"]
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4sph\" (UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.369949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370012 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370174 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370194 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.370342 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472168 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472386 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4sph\" (UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-run\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.472967 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-log-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-lib\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-var-log\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr"
Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " 
pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-etc-ovs\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474580 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d50ee3b1-4f97-4644-802d-04c85d9c3abc-var-run-ovn\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.474848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d50ee3b1-4f97-4644-802d-04c85d9c3abc-scripts\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.475973 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-scripts\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.480633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-ovn-controller-tls-certs\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.480674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d50ee3b1-4f97-4644-802d-04c85d9c3abc-combined-ca-bundle\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.497518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffh47\" (UniqueName: \"kubernetes.io/projected/d50ee3b1-4f97-4644-802d-04c85d9c3abc-kube-api-access-ffh47\") pod \"ovn-controller-mpvqc\" (UID: \"d50ee3b1-4f97-4644-802d-04c85d9c3abc\") " pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.500940 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4sph\" (UniqueName: \"kubernetes.io/projected/22d78adc-2ff6-4f03-b60e-ac8e9a0f3699-kube-api-access-p4sph\") pod \"ovn-controller-ovs-nfzrr\" (UID: \"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699\") " pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.513742 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:31 crc kubenswrapper[4792]: I0301 09:26:31.556512 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.896758 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.898698 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.905325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.905449 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.906084 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.906363 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5fgfl" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.909148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 01 09:26:33 crc kubenswrapper[4792]: I0301 09:26:33.909442 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016444 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.016964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.124952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125008 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " 
pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.125062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.126255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-config\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.126760 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.127012 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.128345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a20f7417-3c04-411a-88b9-d60664faaee3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.132454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.142797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.145363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a20f7417-3c04-411a-88b9-d60664faaee3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.149355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/a20f7417-3c04-411a-88b9-d60664faaee3-kube-api-access-psjgb\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " 
pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.150552 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a20f7417-3c04-411a-88b9-d60664faaee3\") " pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.228938 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.583811 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.585026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.592224 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.592273 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.593437 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.593815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zmrfb" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.594122 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733595 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733614 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.733723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835438 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.835549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.836293 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.837236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.837531 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-config\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.838766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c9312b5-705e-42f0-8462-62c8fdeb0791-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.839803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.840058 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc 
kubenswrapper[4792]: I0301 09:26:34.848721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9312b5-705e-42f0-8462-62c8fdeb0791-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.852510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgrk\" (UniqueName: \"kubernetes.io/projected/2c9312b5-705e-42f0-8462-62c8fdeb0791-kube-api-access-vkgrk\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.870083 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2c9312b5-705e-42f0-8462-62c8fdeb0791\") " pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.920547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.943958 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:26:34 crc kubenswrapper[4792]: I0301 09:26:34.944004 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.056283 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.056997 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fwv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-xrm9m_openstack(1ebfee14-6440-476a-87ff-fc933df3eaa8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.058222 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.102771 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.103090 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws769,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-czwcl_openstack(f8a60915-b57b-4331-ad0e-b671ff576a69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.104495 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.126616 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.126785 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cczf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-sqrnc_openstack(40e11436-ae27-48c0-8baf-3f0aecc8e73c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.127536 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.127704 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q27bj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*
true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-brcdx_openstack(bf78c5d7-c647-4862-bc7b-e14f8de9ef0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.128898 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" podUID="40e11436-ae27-48c0-8baf-3f0aecc8e73c" Mar 01 09:26:49 crc kubenswrapper[4792]: E0301 09:26:49.129199 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" podUID="bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" Mar 01 09:26:49 crc kubenswrapper[4792]: I0301 09:26:49.598338 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc"] Mar 01 09:26:49 crc kubenswrapper[4792]: W0301 09:26:49.946024 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd50ee3b1_4f97_4644_802d_04c85d9c3abc.slice/crio-2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855 WatchSource:0}: Error finding container 2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855: Status 404 returned error can't find the container with id 
2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855 Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.012688 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.032773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b"} Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.037179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc" event={"ID":"d50ee3b1-4f97-4644-802d-04c85d9c3abc","Type":"ContainerStarted","Data":"2e6843120ab5bdcf9f58b38c0a595cc6cbfe7fb2ddd2cd6a13fd86ac14443855"} Mar 01 09:26:50 crc kubenswrapper[4792]: E0301 09:26:50.038537 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" Mar 01 09:26:50 crc kubenswrapper[4792]: E0301 09:26:50.038714 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.517412 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.567383 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613235 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config" (OuterVolumeSpecName: "config") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.613859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") pod \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\" (UID: \"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614503 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.614523 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.620608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj" (OuterVolumeSpecName: "kube-api-access-q27bj") pod "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" (UID: "bf78c5d7-c647-4862-bc7b-e14f8de9ef0f"). InnerVolumeSpecName "kube-api-access-q27bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715433 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") pod \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") pod \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\" (UID: \"40e11436-ae27-48c0-8baf-3f0aecc8e73c\") " Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.715898 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q27bj\" (UniqueName: \"kubernetes.io/projected/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f-kube-api-access-q27bj\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.716311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config" (OuterVolumeSpecName: "config") pod "40e11436-ae27-48c0-8baf-3f0aecc8e73c" (UID: "40e11436-ae27-48c0-8baf-3f0aecc8e73c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.720225 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7" (OuterVolumeSpecName: "kube-api-access-cczf7") pod "40e11436-ae27-48c0-8baf-3f0aecc8e73c" (UID: "40e11436-ae27-48c0-8baf-3f0aecc8e73c"). InnerVolumeSpecName "kube-api-access-cczf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.801322 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-nfzrr"] Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.817356 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e11436-ae27-48c0-8baf-3f0aecc8e73c-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.817393 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cczf7\" (UniqueName: \"kubernetes.io/projected/40e11436-ae27-48c0-8baf-3f0aecc8e73c-kube-api-access-cczf7\") on node \"crc\" DevicePath \"\"" Mar 01 09:26:50 crc kubenswrapper[4792]: I0301 09:26:50.973637 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.043947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"7a6c996b1e0956eceabbb1bb33423f1746d6ef7cb8bf6ead0556fc4690b162c1"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.045489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.048474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.050860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"84d455ad-7bbb-4771-a8ed-9aa1984e1d40","Type":"ContainerStarted","Data":"f83e1270986c4d5bf17f9e23a710fe04a8f8a3a1361ff4a562a5dbb07885c3b0"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.051058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053009 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" event={"ID":"bf78c5d7-c647-4862-bc7b-e14f8de9ef0f","Type":"ContainerDied","Data":"8e90037762d911668e790efa9d44feb3d2539a5a373a55b8095279629113a42e"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053113 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-brcdx" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.053976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" event={"ID":"40e11436-ae27-48c0-8baf-3f0aecc8e73c","Type":"ContainerDied","Data":"89744d9bb053371289c3f391707b9cba897bfe50eed733d33f5f66e5f68036ab"} Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.054288 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-sqrnc" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.114440 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.699907571 podStartE2EDuration="26.114422572s" podCreationTimestamp="2026-03-01 09:26:25 +0000 UTC" firstStartedPulling="2026-03-01 09:26:26.715992601 +0000 UTC m=+1115.957871798" lastFinishedPulling="2026-03-01 09:26:49.130507602 +0000 UTC m=+1138.372386799" observedRunningTime="2026-03-01 09:26:51.109522655 +0000 UTC m=+1140.351401862" watchObservedRunningTime="2026-03-01 09:26:51.114422572 +0000 UTC m=+1140.356301769" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.162616 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.168517 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-sqrnc"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.201939 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.204594 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-brcdx"] Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.441223 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e11436-ae27-48c0-8baf-3f0aecc8e73c" path="/var/lib/kubelet/pods/40e11436-ae27-48c0-8baf-3f0aecc8e73c/volumes" Mar 01 09:26:51 crc kubenswrapper[4792]: I0301 09:26:51.445812 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf78c5d7-c647-4862-bc7b-e14f8de9ef0f" path="/var/lib/kubelet/pods/bf78c5d7-c647-4862-bc7b-e14f8de9ef0f/volumes" Mar 01 09:26:51 crc kubenswrapper[4792]: W0301 09:26:51.658837 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22d78adc_2ff6_4f03_b60e_ac8e9a0f3699.slice/crio-8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541 WatchSource:0}: Error finding container 8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541: Status 404 returned error can't find the container with id 8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541 Mar 01 09:26:51 crc kubenswrapper[4792]: W0301 09:26:51.662531 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda20f7417_3c04_411a_88b9_d60664faaee3.slice/crio-99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf WatchSource:0}: Error finding container 99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf: Status 404 returned error can't find the container with id 99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.062038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"99a86d134f213914eac943cfcb04a2986a73b0106133daa7ff5aafdeef2850cf"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.064111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerStarted","Data":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.064375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.067581 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.069529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"8302b894750954038bb6748f427e127e1b452eb998c2786a648ff5cb7f54a541"} Mar 01 09:26:52 crc kubenswrapper[4792]: I0301 09:26:52.087655 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.474786667 podStartE2EDuration="24.087639814s" podCreationTimestamp="2026-03-01 09:26:28 +0000 UTC" firstStartedPulling="2026-03-01 09:26:29.099865842 +0000 UTC m=+1118.341745029" lastFinishedPulling="2026-03-01 09:26:51.712718969 +0000 UTC m=+1140.954598176" observedRunningTime="2026-03-01 09:26:52.08033771 +0000 UTC m=+1141.322216927" watchObservedRunningTime="2026-03-01 09:26:52.087639814 +0000 UTC m=+1141.329519011" Mar 01 09:26:53 crc kubenswrapper[4792]: I0301 09:26:53.076976 4792 generic.go:334] "Generic (PLEG): container finished" podID="b969e6eb-14a7-4e45-8342-ccbd05c06261" containerID="8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b" exitCode=0 Mar 01 09:26:53 crc kubenswrapper[4792]: I0301 09:26:53.078296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerDied","Data":"8454d15743412518e314144a6752ae268a6efc96deec97d3b909bd1028bb3b9b"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.090670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b969e6eb-14a7-4e45-8342-ccbd05c06261","Type":"ContainerStarted","Data":"527f7c5f55b21db39864a6aeed6e8a3a82f2058bad8f4f37fe92df1ddfa952ac"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 
09:26:54.096958 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc" event={"ID":"d50ee3b1-4f97-4644-802d-04c85d9c3abc","Type":"ContainerStarted","Data":"4c150d0b8509b0afa6465f1ae370aea56d239728b3bbd15c71f0d2de67de9cb9"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.097090 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mpvqc" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.098593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a"} Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.120650 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.019725942 podStartE2EDuration="32.120628322s" podCreationTimestamp="2026-03-01 09:26:22 +0000 UTC" firstStartedPulling="2026-03-01 09:26:25.030789221 +0000 UTC m=+1114.272668418" lastFinishedPulling="2026-03-01 09:26:49.131691601 +0000 UTC m=+1138.373570798" observedRunningTime="2026-03-01 09:26:54.109678421 +0000 UTC m=+1143.351557618" watchObservedRunningTime="2026-03-01 09:26:54.120628322 +0000 UTC m=+1143.362507519" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.137297 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mpvqc" podStartSLOduration=19.383765249 podStartE2EDuration="23.137276859s" podCreationTimestamp="2026-03-01 09:26:31 +0000 UTC" firstStartedPulling="2026-03-01 09:26:49.951478647 +0000 UTC m=+1139.193357834" lastFinishedPulling="2026-03-01 09:26:53.704990247 +0000 UTC m=+1142.946869444" observedRunningTime="2026-03-01 09:26:54.129376191 +0000 UTC m=+1143.371255378" watchObservedRunningTime="2026-03-01 09:26:54.137276859 +0000 UTC m=+1143.379156056" Mar 01 09:26:54 crc 
kubenswrapper[4792]: I0301 09:26:54.334927 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 01 09:26:54 crc kubenswrapper[4792]: I0301 09:26:54.334974 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.107971 4792 generic.go:334] "Generic (PLEG): container finished" podID="22d78adc-2ff6-4f03-b60e-ac8e9a0f3699" containerID="ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a" exitCode=0 Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.108041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerDied","Data":"ae8e8a3920d3b3061bc4fa93f61aeb832c261d8d34750eb1a7881044ff0f527a"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.111934 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"277b3aeef07b3328e0c9713011a75e57d246f6728cbc9d042c8b0f4f1f114f8b"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.116938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"4f34373b18e0c92342a207d871ff46d575d902ff599d1c4b8091120a52e3b9a0"} Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.120500 4792 generic.go:334] "Generic (PLEG): container finished" podID="f2d03d42-7830-444b-a8ae-c91e16d352b9" containerID="b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78" exitCode=0 Mar 01 09:26:55 crc kubenswrapper[4792]: I0301 09:26:55.120596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerDied","Data":"b1b2332536ff5b014987937edb8cc1b3217d724f3d46eed3bc72b1534e0bed78"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.026488 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"79c70ae2f20509a4f11e5f4d26c6756782d88cbdf2b0efc87c5a966cc1e41b31"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-nfzrr" event={"ID":"22d78adc-2ff6-4f03-b60e-ac8e9a0f3699","Type":"ContainerStarted","Data":"da0e79386e4a5ecd379d8a12d635afee3fe617678251383677aa69373b3f0009"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.136945 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.142282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f2d03d42-7830-444b-a8ae-c91e16d352b9","Type":"ContainerStarted","Data":"09bcfb4050ee82529f2845f71f1c73d4c107dbe358b73d31f7dcff55d2806351"} Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.168175 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-nfzrr" podStartSLOduration=23.12564547 podStartE2EDuration="25.168151116s" podCreationTimestamp="2026-03-01 09:26:31 +0000 UTC" firstStartedPulling="2026-03-01 09:26:51.662721917 +0000 UTC m=+1140.904601114" lastFinishedPulling="2026-03-01 09:26:53.705227563 +0000 UTC m=+1142.947106760" observedRunningTime="2026-03-01 09:26:56.156807156 +0000 UTC m=+1145.398686353" watchObservedRunningTime="2026-03-01 
09:26:56.168151116 +0000 UTC m=+1145.410030323" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.185855 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.497912282 podStartE2EDuration="32.185833308s" podCreationTimestamp="2026-03-01 09:26:24 +0000 UTC" firstStartedPulling="2026-03-01 09:26:26.46917743 +0000 UTC m=+1115.711056627" lastFinishedPulling="2026-03-01 09:26:49.157098456 +0000 UTC m=+1138.398977653" observedRunningTime="2026-03-01 09:26:56.181059744 +0000 UTC m=+1145.422938941" watchObservedRunningTime="2026-03-01 09:26:56.185833308 +0000 UTC m=+1145.427712515" Mar 01 09:26:56 crc kubenswrapper[4792]: I0301 09:26:56.556926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.156794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a20f7417-3c04-411a-88b9-d60664faaee3","Type":"ContainerStarted","Data":"9ac8557067dcd7b2193acc9fb67f687c52d1c8c861acf95a5d0d3d85d48e58ad"} Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.158711 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2c9312b5-705e-42f0-8462-62c8fdeb0791","Type":"ContainerStarted","Data":"7319fba5c487cb4e2250ecbe1f38126ed3f63bfddd53b74bfe7cd6a648f825f9"} Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.181193 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.15292734 podStartE2EDuration="26.18117766s" podCreationTimestamp="2026-03-01 09:26:32 +0000 UTC" firstStartedPulling="2026-03-01 09:26:51.665544545 +0000 UTC m=+1140.907423742" lastFinishedPulling="2026-03-01 09:26:57.693794875 +0000 UTC m=+1146.935674062" observedRunningTime="2026-03-01 09:26:58.179367216 +0000 UTC m=+1147.421246413" 
watchObservedRunningTime="2026-03-01 09:26:58.18117766 +0000 UTC m=+1147.423056847" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.210260 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.717987374 podStartE2EDuration="25.210243762s" podCreationTimestamp="2026-03-01 09:26:33 +0000 UTC" firstStartedPulling="2026-03-01 09:26:50.191856546 +0000 UTC m=+1139.433735743" lastFinishedPulling="2026-03-01 09:26:57.684112934 +0000 UTC m=+1146.925992131" observedRunningTime="2026-03-01 09:26:58.203355878 +0000 UTC m=+1147.445235075" watchObservedRunningTime="2026-03-01 09:26:58.210243762 +0000 UTC m=+1147.452122959" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.229352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.272043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.423172 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.507947 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.556535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.920779 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:58 crc kubenswrapper[4792]: I0301 09:26:58.957451 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.167094 4792 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.167136 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.202815 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.208548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.553628 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.623058 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.632497 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.638740 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.641295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.666738 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7wc55"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.667690 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.672171 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.683878 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7wc55"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " 
pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763876 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763900 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod 
\"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.763995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.850226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.861582 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865009 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865270 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.865703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovn-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9493aff0-58e3-44ca-ba01-69f3b284d732-config\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9493aff0-58e3-44ca-ba01-69f3b284d732-ovs-rundir\") pod \"ovn-controller-metrics-7wc55\" (UID: 
\"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866488 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.866717 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.874643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-combined-ca-bundle\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887331 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887633 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.887732 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-f4phx" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.888710 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9493aff0-58e3-44ca-ba01-69f3b284d732-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.897528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.925422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.927569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wx4d\" (UniqueName: \"kubernetes.io/projected/9493aff0-58e3-44ca-ba01-69f3b284d732-kube-api-access-6wx4d\") pod \"ovn-controller-metrics-7wc55\" (UID: \"9493aff0-58e3-44ca-ba01-69f3b284d732\") " pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.928074 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"dnsmasq-dns-6444958b7f-j6c8g\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.946548 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.956582 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.968608 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.972153 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.980944 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981061 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981279 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981356 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:26:59 crc kubenswrapper[4792]: I0301 09:26:59.981547 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.010230 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7wc55" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.016227 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084037 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084113 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.084148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: 
\"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.085424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.088958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.089458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-config\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.089599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1712e112-23fd-402b-ae0b-f63a594d4fab-scripts\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.094288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.107558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.116450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1712e112-23fd-402b-ae0b-f63a594d4fab-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.123073 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84hzh\" (UniqueName: \"kubernetes.io/projected/1712e112-23fd-402b-ae0b-f63a594d4fab-kube-api-access-84hzh\") pod \"ovn-northd-0\" (UID: \"1712e112-23fd-402b-ae0b-f63a594d4fab\") " pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " 
pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.187882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.190107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.190786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: 
I0301 09:27:00.191605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.193636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.202496 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.214299 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.214474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-czwcl" event={"ID":"f8a60915-b57b-4331-ad0e-b671ff576a69","Type":"ContainerDied","Data":"67be12a989d9394e207bd8944b8cd6294460a40d572fd7dffd8e27ec9d363809"} Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.230214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"dnsmasq-dns-7b57d9888c-mz86z\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.290632 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") pod \"f8a60915-b57b-4331-ad0e-b671ff576a69\" (UID: \"f8a60915-b57b-4331-ad0e-b671ff576a69\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.292519 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config" (OuterVolumeSpecName: "config") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.294037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.297347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769" (OuterVolumeSpecName: "kube-api-access-ws769") pod "f8a60915-b57b-4331-ad0e-b671ff576a69" (UID: "f8a60915-b57b-4331-ad0e-b671ff576a69"). InnerVolumeSpecName "kube-api-access-ws769". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.358728 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.361253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.387307 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392315 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392354 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws769\" (UniqueName: \"kubernetes.io/projected/f8a60915-b57b-4331-ad0e-b671ff576a69-kube-api-access-ws769\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.392371 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8a60915-b57b-4331-ad0e-b671ff576a69-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.495895 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") pod \"1ebfee14-6440-476a-87ff-fc933df3eaa8\" (UID: \"1ebfee14-6440-476a-87ff-fc933df3eaa8\") " Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.497744 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config" (OuterVolumeSpecName: "config") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.501267 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.501471 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9" (OuterVolumeSpecName: "kube-api-access-6fwv9") pod "1ebfee14-6440-476a-87ff-fc933df3eaa8" (UID: "1ebfee14-6440-476a-87ff-fc933df3eaa8"). InnerVolumeSpecName "kube-api-access-6fwv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597839 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597888 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fwv9\" (UniqueName: \"kubernetes.io/projected/1ebfee14-6440-476a-87ff-fc933df3eaa8-kube-api-access-6fwv9\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.597969 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ebfee14-6440-476a-87ff-fc933df3eaa8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.626313 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.636687 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-czwcl"] Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.662932 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.797856 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7wc55"] Mar 01 09:27:00 crc kubenswrapper[4792]: W0301 09:27:00.805604 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9493aff0_58e3_44ca_ba01_69f3b284d732.slice/crio-a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d WatchSource:0}: Error finding container a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d: Status 404 returned error can't find the container with id 
a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d Mar 01 09:27:00 crc kubenswrapper[4792]: I0301 09:27:00.940061 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 01 09:27:00 crc kubenswrapper[4792]: W0301 09:27:00.963513 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1712e112_23fd_402b_ae0b_f63a594d4fab.slice/crio-ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d WatchSource:0}: Error finding container ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d: Status 404 returned error can't find the container with id ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.043269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.220099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"ba7f2650ceeaba5235acd15be0999a93cf7db00d9f318c1eb3a0167a08dc809d"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.222305 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" event={"ID":"1ebfee14-6440-476a-87ff-fc933df3eaa8","Type":"ContainerDied","Data":"dfb18f2a46a9a4bd99a6d3ee488952b80257c666cf6b574e91f2c62a39fd84f6"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.222458 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-xrm9m" Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.229195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerStarted","Data":"b2e8c06688804bde6de3674de22e963e00a7db65a6fb4924d8a98f95171a76cc"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.232719 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerStarted","Data":"0a697580492b727fa9bf9603c48fb17f94e586d5d0286904452488d9188bd582"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.234280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7wc55" event={"ID":"9493aff0-58e3-44ca-ba01-69f3b284d732","Type":"ContainerStarted","Data":"60455b6869ef76ca128b15b1a396f701387e37d930aa24eb9126981d2885fa5b"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.234380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7wc55" event={"ID":"9493aff0-58e3-44ca-ba01-69f3b284d732","Type":"ContainerStarted","Data":"a74e88581a327876e95868d41b270959ebb242e2f3a2355c320482a146ea207d"} Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.264684 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7wc55" podStartSLOduration=2.264664422 podStartE2EDuration="2.264664422s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:01.261369754 +0000 UTC m=+1150.503248951" watchObservedRunningTime="2026-03-01 09:27:01.264664422 +0000 UTC m=+1150.506543619" Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.323270 4792 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.331475 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-xrm9m"] Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.423846 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebfee14-6440-476a-87ff-fc933df3eaa8" path="/var/lib/kubelet/pods/1ebfee14-6440-476a-87ff-fc933df3eaa8/volumes" Mar 01 09:27:01 crc kubenswrapper[4792]: I0301 09:27:01.424205 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a60915-b57b-4331-ad0e-b671ff576a69" path="/var/lib/kubelet/pods/f8a60915-b57b-4331-ad0e-b671ff576a69/volumes" Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.263233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"b137b70528617dca2a15862de2f905a20234a875aa1eecd21babd85b68e4a3ee"} Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.265229 4792 generic.go:334] "Generic (PLEG): container finished" podID="b3682585-f554-4a65-86cb-096243ccc793" containerID="0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd" exitCode=0 Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.265283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd"} Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.278281 4792 generic.go:334] "Generic (PLEG): container finished" podID="87678b56-0909-4735-ad6b-cb992dc86853" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" exitCode=0 Mar 01 09:27:02 crc kubenswrapper[4792]: I0301 09:27:02.278356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7"} Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.085284 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.086497 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.088752 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.103704 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.242263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.242326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.286878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" 
event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerStarted","Data":"6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865"} Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.286977 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.290133 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerStarted","Data":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"} Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.290300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.291921 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1712e112-23fd-402b-ae0b-f63a594d4fab","Type":"ContainerStarted","Data":"23a988a54e4c4461bbed778852e583fffba07392434e115c274bc6841e8ab6bb"} Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.292041 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.307630 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" podStartSLOduration=4.307614137 podStartE2EDuration="4.307614137s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:03.303203942 +0000 UTC m=+1152.545083139" watchObservedRunningTime="2026-03-01 09:27:03.307614137 +0000 UTC m=+1152.549493334" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.332281 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" podStartSLOduration=3.916259031 podStartE2EDuration="4.332259265s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" firstStartedPulling="2026-03-01 09:27:00.669807646 +0000 UTC m=+1149.911686843" lastFinishedPulling="2026-03-01 09:27:01.08580788 +0000 UTC m=+1150.327687077" observedRunningTime="2026-03-01 09:27:03.32367454 +0000 UTC m=+1152.565553777" watchObservedRunningTime="2026-03-01 09:27:03.332259265 +0000 UTC m=+1152.574138472" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.343806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.344078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.345285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.351940 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.330566242 podStartE2EDuration="4.351922113s" podCreationTimestamp="2026-03-01 09:26:59 +0000 UTC" 
firstStartedPulling="2026-03-01 09:27:00.966075166 +0000 UTC m=+1150.207954363" lastFinishedPulling="2026-03-01 09:27:01.987431037 +0000 UTC m=+1151.229310234" observedRunningTime="2026-03-01 09:27:03.342803696 +0000 UTC m=+1152.584682923" watchObservedRunningTime="2026-03-01 09:27:03.351922113 +0000 UTC m=+1152.593801310" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.373999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"root-account-create-update-j6lbg\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.405975 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:03 crc kubenswrapper[4792]: I0301 09:27:03.829783 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298856 4792 generic.go:334] "Generic (PLEG): container finished" podID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerID="6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73" exitCode=0 Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerDied","Data":"6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73"} Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.298965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerStarted","Data":"955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36"} Mar 01 09:27:04 crc 
kubenswrapper[4792]: I0301 09:27:04.943224 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.943503 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.943535 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.944211 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:27:04 crc kubenswrapper[4792]: I0301 09:27:04.944270 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" gracePeriod=600 Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308398 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" 
containerID="9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" exitCode=0 Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308591 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f"} Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"} Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.308637 4792 scope.go:117] "RemoveContainer" containerID="f223d76bf80d673696c61fa09c13cc363c202a24d360c9e2da1e52c335e05521" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.654668 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.719583 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.719643 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.788799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") pod \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.788961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") pod \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\" (UID: \"77225d3a-9f2d-4aaf-8b98-6dc5310db3da\") " Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.789784 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77225d3a-9f2d-4aaf-8b98-6dc5310db3da" (UID: "77225d3a-9f2d-4aaf-8b98-6dc5310db3da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.793430 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.800153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw" (OuterVolumeSpecName: "kube-api-access-tn2gw") pod "77225d3a-9f2d-4aaf-8b98-6dc5310db3da" (UID: "77225d3a-9f2d-4aaf-8b98-6dc5310db3da"). InnerVolumeSpecName "kube-api-access-tn2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.891276 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2gw\" (UniqueName: \"kubernetes.io/projected/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-kube-api-access-tn2gw\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:05 crc kubenswrapper[4792]: I0301 09:27:05.891625 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77225d3a-9f2d-4aaf-8b98-6dc5310db3da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.191368 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dlv4c"] Mar 01 09:27:06 crc kubenswrapper[4792]: E0301 09:27:06.191927 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192016 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192224 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" containerName="mariadb-account-create-update" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.192745 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.206315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dlv4c"] Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.285880 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"] Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.287108 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.291279 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.296596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.296687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.299588 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"] Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 
09:27:06.326363 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j6lbg" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.326359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j6lbg" event={"ID":"77225d3a-9f2d-4aaf-8b98-6dc5310db3da","Type":"ContainerDied","Data":"955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36"} Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.327535 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955029b7b4fc653d1ccc05aa7f77cb849f104e2da77ccb3a30ce4c53323eba36" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397646 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.397680 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 
09:27:06.397768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.399353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.423499 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.424117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"glance-db-create-dlv4c\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: 
\"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.499879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.511321 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.526456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"glance-2c3a-account-create-update-vnvfb\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.619852 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.967493 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.968648 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.983232 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:27:06 crc kubenswrapper[4792]: I0301 09:27:06.991276 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dlv4c"] Mar 01 09:27:06 crc kubenswrapper[4792]: W0301 09:27:06.999028 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod192b539c_c4b9_4c4e_93e3_23b6dc0d7ec5.slice/crio-09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf WatchSource:0}: Error finding container 09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf: Status 404 returned error can't find the container with id 09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.077278 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.078168 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.082891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.091621 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.116460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.116509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.191643 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.193326 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.217845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.218707 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.219274 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.246663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"keystone-db-create-f95nh\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.249536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"] Mar 01 09:27:07 crc kubenswrapper[4792]: W0301 09:27:07.255072 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod869a99e5_f399_4938_ba59_bbe20e23385b.slice/crio-2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a WatchSource:0}: Error finding container 2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a: Status 404 returned error can't find the container with id 2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.275444 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.277525 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.288465 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.296403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.296696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.320262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.321159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.336459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"keystone-11c1-account-create-update-8h9xf\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.345147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerStarted","Data":"09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf"} Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.346842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerStarted","Data":"2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a"} Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.390659 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.429987 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.430938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.449721 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"placement-db-create-8zsss\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.510654 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.531234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.532401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.533573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " 
pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.566537 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"placement-d8c5-account-create-update-z4zgs\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.611225 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.871443 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:27:07 crc kubenswrapper[4792]: W0301 09:27:07.875155 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod127158ae_b49c_42bd_932d_af85eafce8c0.slice/crio-9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b WatchSource:0}: Error finding container 9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b: Status 404 returned error can't find the container with id 9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b Mar 01 09:27:07 crc kubenswrapper[4792]: I0301 09:27:07.946363 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.055670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:27:08 crc kubenswrapper[4792]: W0301 09:27:08.072102 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272107df_b15b_4c97_b9b0_e865f9a391da.slice/crio-63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763 WatchSource:0}: Error finding container 63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763: Status 404 returned error can't find the container with id 63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.186460 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.356242 4792 generic.go:334] "Generic (PLEG): container finished" podID="869a99e5-f399-4938-ba59-bbe20e23385b" containerID="5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.356308 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerDied","Data":"5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357536 4792 generic.go:334] "Generic (PLEG): container finished" podID="127158ae-b49c-42bd-932d-af85eafce8c0" containerID="b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerDied","Data":"b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.357633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" 
event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerStarted","Data":"9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.359780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerStarted","Data":"4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.361001 4792 generic.go:334] "Generic (PLEG): container finished" podID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerID="0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.361055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerDied","Data":"0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362043 4792 generic.go:334] "Generic (PLEG): container finished" podID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerID="79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9" exitCode=0 Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerDied","Data":"79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.362103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerStarted","Data":"013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 
09:27:08.364100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerStarted","Data":"98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.364124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerStarted","Data":"63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763"} Mar 01 09:27:08 crc kubenswrapper[4792]: I0301 09:27:08.407946 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-8zsss" podStartSLOduration=1.407929183 podStartE2EDuration="1.407929183s" podCreationTimestamp="2026-03-01 09:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:08.406180392 +0000 UTC m=+1157.648059589" watchObservedRunningTime="2026-03-01 09:27:08.407929183 +0000 UTC m=+1157.649808380" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.371447 4792 generic.go:334] "Generic (PLEG): container finished" podID="272107df-b15b-4c97-b9b0-e865f9a391da" containerID="98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98" exitCode=0 Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.371699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerDied","Data":"98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98"} Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.373792 4792 generic.go:334] "Generic (PLEG): container finished" podID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerID="3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce" exitCode=0 Mar 01 09:27:09 crc 
kubenswrapper[4792]: I0301 09:27:09.373891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerDied","Data":"3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce"} Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.692734 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.799951 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.875183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") pod \"127158ae-b49c-42bd-932d-af85eafce8c0\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.875220 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") pod \"127158ae-b49c-42bd-932d-af85eafce8c0\" (UID: \"127158ae-b49c-42bd-932d-af85eafce8c0\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.876516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "127158ae-b49c-42bd-932d-af85eafce8c0" (UID: "127158ae-b49c-42bd-932d-af85eafce8c0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.882292 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46" (OuterVolumeSpecName: "kube-api-access-vqd46") pod "127158ae-b49c-42bd-932d-af85eafce8c0" (UID: "127158ae-b49c-42bd-932d-af85eafce8c0"). InnerVolumeSpecName "kube-api-access-vqd46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.918596 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.927121 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.959590 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.976588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") pod \"869a99e5-f399-4938-ba59-bbe20e23385b\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.976642 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") pod \"869a99e5-f399-4938-ba59-bbe20e23385b\" (UID: \"869a99e5-f399-4938-ba59-bbe20e23385b\") " Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977027 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "869a99e5-f399-4938-ba59-bbe20e23385b" (UID: "869a99e5-f399-4938-ba59-bbe20e23385b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977106 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqd46\" (UniqueName: \"kubernetes.io/projected/127158ae-b49c-42bd-932d-af85eafce8c0-kube-api-access-vqd46\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.977125 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127158ae-b49c-42bd-932d-af85eafce8c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:09 crc kubenswrapper[4792]: I0301 09:27:09.982779 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w" (OuterVolumeSpecName: "kube-api-access-db95w") pod "869a99e5-f399-4938-ba59-bbe20e23385b" (UID: "869a99e5-f399-4938-ba59-bbe20e23385b"). InnerVolumeSpecName "kube-api-access-db95w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") pod \"46d8b4e1-c1b5-468c-b319-84985c525d6a\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") pod \"46d8b4e1-c1b5-468c-b319-84985c525d6a\" (UID: \"46d8b4e1-c1b5-468c-b319-84985c525d6a\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") pod \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078321 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") pod \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\" (UID: \"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078742 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db95w\" (UniqueName: \"kubernetes.io/projected/869a99e5-f399-4938-ba59-bbe20e23385b-kube-api-access-db95w\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.078759 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/869a99e5-f399-4938-ba59-bbe20e23385b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.079136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46d8b4e1-c1b5-468c-b319-84985c525d6a" (UID: "46d8b4e1-c1b5-468c-b319-84985c525d6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.079606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" (UID: "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.083387 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2" (OuterVolumeSpecName: "kube-api-access-v2pr2") pod "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" (UID: "192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5"). InnerVolumeSpecName "kube-api-access-v2pr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.083400 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6" (OuterVolumeSpecName: "kube-api-access-zj9z6") pod "46d8b4e1-c1b5-468c-b319-84985c525d6a" (UID: "46d8b4e1-c1b5-468c-b319-84985c525d6a"). InnerVolumeSpecName "kube-api-access-zj9z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.179948 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180129 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2pr2\" (UniqueName: \"kubernetes.io/projected/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5-kube-api-access-v2pr2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180208 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj9z6\" (UniqueName: \"kubernetes.io/projected/46d8b4e1-c1b5-468c-b319-84985c525d6a-kube-api-access-zj9z6\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.180263 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d8b4e1-c1b5-468c-b319-84985c525d6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3a-account-create-update-vnvfb" event={"ID":"869a99e5-f399-4938-ba59-bbe20e23385b","Type":"ContainerDied","Data":"2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381750 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a94915c9a7958c2ca19c1867a8b54e166c8909a2e2ab8660c53f3728dff551a" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.381522 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2c3a-account-create-update-vnvfb" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f95nh" event={"ID":"127158ae-b49c-42bd-932d-af85eafce8c0","Type":"ContainerDied","Data":"9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385458 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a99c4b2686168f24fce829e1a3e70dd78fb16bebec4c0aa6697c70dee318f7b" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.385588 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f95nh" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.391512 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392435 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dlv4c" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dlv4c" event={"ID":"192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5","Type":"ContainerDied","Data":"09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.392521 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c567031c462356ea2900b9a1ef75321e0cbd49a89efe5f19a9b5a3e329bfaf" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-11c1-account-create-update-8h9xf" event={"ID":"46d8b4e1-c1b5-468c-b319-84985c525d6a","Type":"ContainerDied","Data":"013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3"} Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394292 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="013b3e77f344d3b66e5bc6a232922ce5e3b73c8cdd4b83b86d6000024b74e7d3" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.394414 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-11c1-account-create-update-8h9xf" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.474692 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.474914 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" containerID="cri-o://145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" gracePeriod=10 Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.863009 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.872836 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") pod \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") pod \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\" (UID: \"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") pod \"272107df-b15b-4c97-b9b0-e865f9a391da\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.993892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") pod \"272107df-b15b-4c97-b9b0-e865f9a391da\" (UID: \"272107df-b15b-4c97-b9b0-e865f9a391da\") " Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.994351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" (UID: "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:10 crc kubenswrapper[4792]: I0301 09:27:10.994541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "272107df-b15b-4c97-b9b0-e865f9a391da" (UID: "272107df-b15b-4c97-b9b0-e865f9a391da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.003138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr" (OuterVolumeSpecName: "kube-api-access-rxrtr") pod "272107df-b15b-4c97-b9b0-e865f9a391da" (UID: "272107df-b15b-4c97-b9b0-e865f9a391da"). InnerVolumeSpecName "kube-api-access-rxrtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.003613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d" (OuterVolumeSpecName: "kube-api-access-j9s2d") pod "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" (UID: "58e6fcdd-44b2-4c03-9cf6-a772bd0c3779"). InnerVolumeSpecName "kube-api-access-j9s2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.027395 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxrtr\" (UniqueName: \"kubernetes.io/projected/272107df-b15b-4c97-b9b0-e865f9a391da-kube-api-access-rxrtr\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095581 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272107df-b15b-4c97-b9b0-e865f9a391da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095590 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s2d\" (UniqueName: \"kubernetes.io/projected/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-kube-api-access-j9s2d\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.095599 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196614 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196684 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196780 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.196839 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") pod \"87678b56-0909-4735-ad6b-cb992dc86853\" (UID: \"87678b56-0909-4735-ad6b-cb992dc86853\") " Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.200076 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9" (OuterVolumeSpecName: "kube-api-access-tzbw9") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "kube-api-access-tzbw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.231917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.247402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config" (OuterVolumeSpecName: "config") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.248325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87678b56-0909-4735-ad6b-cb992dc86853" (UID: "87678b56-0909-4735-ad6b-cb992dc86853"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298617 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298665 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298682 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87678b56-0909-4735-ad6b-cb992dc86853-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.298699 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzbw9\" (UniqueName: \"kubernetes.io/projected/87678b56-0909-4735-ad6b-cb992dc86853-kube-api-access-tzbw9\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397105 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397400 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397413 4792 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397434 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397439 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397450 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397456 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397465 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="init" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397471 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="init" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397486 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397492 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397501 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 
09:27:11.397506 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397519 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.397538 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397543 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397683 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397693 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87678b56-0909-4735-ad6b-cb992dc86853" containerName="dnsmasq-dns" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397704 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397716 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397725 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" 
containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397734 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" containerName="mariadb-account-create-update" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.397746 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" containerName="mariadb-database-create" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.398202 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.400581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d8c5-account-create-update-z4zgs" event={"ID":"58e6fcdd-44b2-4c03-9cf6-a772bd0c3779","Type":"ContainerDied","Data":"4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d8c5-account-create-update-z4zgs" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.404557 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bfc939eefab66aa8b4620743c13ae7fca651fc6f905e4eef02dd74188eabdb0" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407326 4792 generic.go:334] "Generic (PLEG): container finished" podID="87678b56-0909-4735-ad6b-cb992dc86853" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" exitCode=0 Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407392 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" event={"ID":"87678b56-0909-4735-ad6b-cb992dc86853","Type":"ContainerDied","Data":"0a697580492b727fa9bf9603c48fb17f94e586d5d0286904452488d9188bd582"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407408 4792 scope.go:117] "RemoveContainer" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.407508 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-j6c8g" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.409070 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.411305 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8zsss" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.436855 4792 scope.go:117] "RemoveContainer" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.473836 4792 scope.go:117] "RemoveContainer" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.474409 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": container with ID starting with 145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894 not found: ID does not exist" containerID="145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.474448 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894"} err="failed to get container status \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": rpc error: code = NotFound desc = could not find container \"145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894\": container with ID starting with 145f85c9e4bed69559d3fa2321198a2801d674c147a18cd06a91ee52950e9894 not found: ID does not exist" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.474475 4792 scope.go:117] "RemoveContainer" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: E0301 09:27:11.475258 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": container with ID starting with 
dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7 not found: ID does not exist" containerID="dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.475282 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7"} err="failed to get container status \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": rpc error: code = NotFound desc = could not find container \"dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7\": container with ID starting with dda5ed153c6754e09b06fdc62aa62a423747b40a02478faacd4a8e291bb4a8b7 not found: ID does not exist" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480497 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8zsss" event={"ID":"272107df-b15b-4c97-b9b0-e865f9a391da","Type":"ContainerDied","Data":"63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763"} Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480726 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63bb42f2da2fd3c9a15586710b3a6b5b1ba3e77ac05b9db5ef3689f411e43763" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.480742 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.492781 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-j6c8g"] Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.505892 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.506756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608519 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608613 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xks9p\" (UniqueName: 
\"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.608696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.613322 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.615961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.617962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"glance-db-sync-ks68h\" (UID: 
\"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.624460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"glance-db-sync-ks68h\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:11 crc kubenswrapper[4792]: I0301 09:27:11.757259 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:12 crc kubenswrapper[4792]: I0301 09:27:12.413776 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:27:13 crc kubenswrapper[4792]: I0301 09:27:13.425760 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87678b56-0909-4735-ad6b-cb992dc86853" path="/var/lib/kubelet/pods/87678b56-0909-4735-ad6b-cb992dc86853/volumes" Mar 01 09:27:13 crc kubenswrapper[4792]: I0301 09:27:13.430507 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerStarted","Data":"cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c"} Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.332726 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.342470 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j6lbg"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.413160 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.414397 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.421992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.422134 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.573376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.573490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.675477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.675563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: 
\"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.676621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.693815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"root-account-create-update-2dgrc\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:14 crc kubenswrapper[4792]: I0301 09:27:14.736377 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.166269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:27:15 crc kubenswrapper[4792]: W0301 09:27:15.188084 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e80b4e_b68a_48a2_b0fe_e5cf19e00669.slice/crio-4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b WatchSource:0}: Error finding container 4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b: Status 404 returned error can't find the container with id 4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.418798 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77225d3a-9f2d-4aaf-8b98-6dc5310db3da" path="/var/lib/kubelet/pods/77225d3a-9f2d-4aaf-8b98-6dc5310db3da/volumes" Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.467628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerStarted","Data":"85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951"} Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.467686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerStarted","Data":"4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b"} Mar 01 09:27:15 crc kubenswrapper[4792]: I0301 09:27:15.490558 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2dgrc" podStartSLOduration=1.490542509 podStartE2EDuration="1.490542509s" podCreationTimestamp="2026-03-01 09:27:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:15.484723271 +0000 UTC m=+1164.726602488" watchObservedRunningTime="2026-03-01 09:27:15.490542509 +0000 UTC m=+1164.732421706" Mar 01 09:27:16 crc kubenswrapper[4792]: I0301 09:27:16.475409 4792 generic.go:334] "Generic (PLEG): container finished" podID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerID="85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951" exitCode=0 Mar 01 09:27:16 crc kubenswrapper[4792]: I0301 09:27:16.475688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerDied","Data":"85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951"} Mar 01 09:27:20 crc kubenswrapper[4792]: I0301 09:27:20.431385 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 01 09:27:22 crc kubenswrapper[4792]: I0301 09:27:22.534615 4792 generic.go:334] "Generic (PLEG): container finished" podID="6252a079-917c-46e8-a848-10569e1e057e" containerID="81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f" exitCode=0 Mar 01 09:27:22 crc kubenswrapper[4792]: I0301 09:27:22.534659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f"} Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.546302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2dgrc" event={"ID":"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669","Type":"ContainerDied","Data":"4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b"} Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.547639 4792 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="4f51b274cb581b24cd99a79ceeae3ca3c753bc062a04d1f27a5ca9e221f2dd2b" Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.549464 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b" exitCode=0 Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.549514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"} Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.569266 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.743009 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") pod \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.743168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") pod \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\" (UID: \"e9e80b4e-b68a-48a2-b0fe-e5cf19e00669\") " Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.744764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" (UID: "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.748681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj" (OuterVolumeSpecName: "kube-api-access-bwrqj") pod "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" (UID: "e9e80b4e-b68a-48a2-b0fe-e5cf19e00669"). InnerVolumeSpecName "kube-api-access-bwrqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.845550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwrqj\" (UniqueName: \"kubernetes.io/projected/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-kube-api-access-bwrqj\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:23 crc kubenswrapper[4792]: I0301 09:27:23.845594 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.558513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerStarted","Data":"01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3"} Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.560892 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerStarted","Data":"e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251"} Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.561193 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564305 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2dgrc" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564684 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerStarted","Data":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"} Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.564897 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.608148 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ks68h" podStartSLOduration=2.439438714 podStartE2EDuration="13.608130848s" podCreationTimestamp="2026-03-01 09:27:11 +0000 UTC" firstStartedPulling="2026-03-01 09:27:12.425567358 +0000 UTC m=+1161.667446555" lastFinishedPulling="2026-03-01 09:27:23.594259492 +0000 UTC m=+1172.836138689" observedRunningTime="2026-03-01 09:27:24.589303125 +0000 UTC m=+1173.831182322" watchObservedRunningTime="2026-03-01 09:27:24.608130848 +0000 UTC m=+1173.850010045" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.622973 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.097113854 podStartE2EDuration="1m3.622956661s" podCreationTimestamp="2026-03-01 09:26:21 +0000 UTC" firstStartedPulling="2026-03-01 09:26:23.605844233 +0000 UTC m=+1112.847723430" lastFinishedPulling="2026-03-01 09:26:49.13168704 +0000 UTC m=+1138.373566237" observedRunningTime="2026-03-01 09:27:24.619377744 +0000 UTC m=+1173.861256941" watchObservedRunningTime="2026-03-01 09:27:24.622956661 +0000 UTC m=+1173.864835858" Mar 01 09:27:24 crc kubenswrapper[4792]: I0301 09:27:24.648491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=38.442271778 podStartE2EDuration="1m3.648474688s" podCreationTimestamp="2026-03-01 09:26:21 +0000 UTC" firstStartedPulling="2026-03-01 09:26:23.864243901 +0000 UTC m=+1113.106123098" lastFinishedPulling="2026-03-01 09:26:49.070446821 +0000 UTC m=+1138.312326008" observedRunningTime="2026-03-01 09:27:24.644749836 +0000 UTC m=+1173.886629033" watchObservedRunningTime="2026-03-01 09:27:24.648474688 +0000 UTC m=+1173.890353885" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.618382 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mpvqc" podUID="d50ee3b1-4f97-4644-802d-04c85d9c3abc" containerName="ovn-controller" probeResult="failure" output=< Mar 01 09:27:26 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 01 09:27:26 crc kubenswrapper[4792]: > Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.657584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.659855 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-nfzrr" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899041 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"] Mar 01 09:27:26 crc kubenswrapper[4792]: E0301 09:27:26.899451 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899472 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.899625 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" containerName="mariadb-account-create-update" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.900126 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.914287 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.918868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"] Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996940 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.996985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997023 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:26 crc kubenswrapper[4792]: I0301 09:27:26.997087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.098656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: 
I0301 09:27:27.099480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.100402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.100359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099605 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.099433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.102506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.117586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"ovn-controller-mpvqc-config-vmwpv\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.223738 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.509182 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"] Mar 01 09:27:27 crc kubenswrapper[4792]: W0301 09:27:27.535205 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2ec422a_59ee_429e_8e59_5d08e22bc9a6.slice/crio-fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5 WatchSource:0}: Error finding container fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5: Status 404 returned error can't find the container with id fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5 Mar 01 09:27:27 crc kubenswrapper[4792]: I0301 09:27:27.592752 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerStarted","Data":"fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5"} Mar 01 09:27:28 crc kubenswrapper[4792]: I0301 09:27:28.601138 4792 generic.go:334] "Generic (PLEG): container finished" podID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerID="68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440" exitCode=0 Mar 01 09:27:28 crc kubenswrapper[4792]: I0301 09:27:28.601184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerDied","Data":"68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440"} Mar 01 09:27:29 crc kubenswrapper[4792]: I0301 09:27:29.932267 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052671 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052700 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.052856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") pod \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\" (UID: \"e2ec422a-59ee-429e-8e59-5d08e22bc9a6\") " Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.053992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run" (OuterVolumeSpecName: "var-run") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054329 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts" (OuterVolumeSpecName: "scripts") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.054368 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.075082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5" (OuterVolumeSpecName: "kube-api-access-88jl5") pod "e2ec422a-59ee-429e-8e59-5d08e22bc9a6" (UID: "e2ec422a-59ee-429e-8e59-5d08e22bc9a6"). InnerVolumeSpecName "kube-api-access-88jl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154491 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154527 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jl5\" (UniqueName: \"kubernetes.io/projected/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-kube-api-access-88jl5\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154539 4792 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154559 4792 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run\") on node \"crc\" DevicePath 
\"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154568 4792 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.154576 4792 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e2ec422a-59ee-429e-8e59-5d08e22bc9a6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mpvqc-config-vmwpv" event={"ID":"e2ec422a-59ee-429e-8e59-5d08e22bc9a6","Type":"ContainerDied","Data":"fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5"} Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617254 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa73e573aee706c57ffb03f318a435a06c59a627a88dfec63bde950f977943c5" Mar 01 09:27:30 crc kubenswrapper[4792]: I0301 09:27:30.617317 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mpvqc-config-vmwpv" Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.081774 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"] Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.091872 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mpvqc-config-vmwpv"] Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.416698 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" path="/var/lib/kubelet/pods/e2ec422a-59ee-429e-8e59-5d08e22bc9a6/volumes" Mar 01 09:27:31 crc kubenswrapper[4792]: I0301 09:27:31.661226 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mpvqc" Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.233381 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.655043 4792 generic.go:334] "Generic (PLEG): container finished" podID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerID="01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3" exitCode=0 Mar 01 09:27:33 crc kubenswrapper[4792]: I0301 09:27:33.655092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerDied","Data":"01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3"} Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.019641 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.128779 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.128980 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.129058 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.129136 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") pod \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\" (UID: \"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89\") " Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.137613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.139185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p" (OuterVolumeSpecName: "kube-api-access-xks9p") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "kube-api-access-xks9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.156377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.175238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data" (OuterVolumeSpecName: "config-data") pod "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" (UID: "72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231809 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231852 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231867 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.231880 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xks9p\" (UniqueName: \"kubernetes.io/projected/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89-kube-api-access-xks9p\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ks68h" event={"ID":"72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89","Type":"ContainerDied","Data":"cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c"} Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669454 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb43fcc4ccc58a02f1becab11b8e15aea8e8f7775807952c1c371315885d3f5c" Mar 01 09:27:35 crc kubenswrapper[4792]: I0301 09:27:35.669931 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ks68h" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.097649 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:27:36 crc kubenswrapper[4792]: E0301 09:27:36.097990 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098004 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync" Mar 01 09:27:36 crc kubenswrapper[4792]: E0301 09:27:36.098025 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098032 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098191 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ec422a-59ee-429e-8e59-5d08e22bc9a6" containerName="ovn-config" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.098204 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" containerName="glance-db-sync" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.099008 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.129151 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144207 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144239 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.144360 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.245985 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.246038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.246978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247340 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.247426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.271945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod 
\"dnsmasq-dns-6958f867f9-p8x5s\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.416575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:36 crc kubenswrapper[4792]: I0301 09:27:36.859532 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:27:37 crc kubenswrapper[4792]: E0301 09:27:37.231876 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod667fff68_7113_4dfe_86b4_34b80b41d326.slice/crio-e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9.scope\": RecentStats: unable to find data in memory cache]" Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685432 4792 generic.go:334] "Generic (PLEG): container finished" podID="667fff68-7113-4dfe-86b4-34b80b41d326" containerID="e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9" exitCode=0 Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9"} Mar 01 09:27:37 crc kubenswrapper[4792]: I0301 09:27:37.685768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerStarted","Data":"386c10f9f902b49dfdcc7e63fa9588772c851427627a00112a847f427ef0dd79"} Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.693854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" 
event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerStarted","Data":"b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5"} Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.694346 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:38 crc kubenswrapper[4792]: I0301 09:27:38.719565 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podStartSLOduration=2.719540234 podStartE2EDuration="2.719540234s" podCreationTimestamp="2026-03-01 09:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:38.709862596 +0000 UTC m=+1187.951741813" watchObservedRunningTime="2026-03-01 09:27:38.719540234 +0000 UTC m=+1187.961419441" Mar 01 09:27:42 crc kubenswrapper[4792]: I0301 09:27:42.930111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:27:43 crc kubenswrapper[4792]: I0301 09:27:43.236215 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.838543 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.839679 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.852018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.947800 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.950941 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.952863 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 01 09:27:44 crc kubenswrapper[4792]: I0301 09:27:44.975361 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006488 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxnh\" (UniqueName: 
\"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.006516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.017617 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.018659 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.035322 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.107465 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.108866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 
09:27:45.109835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.119262 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.120417 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.122974 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.149126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"cinder-8714-account-create-update-wzssg\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.149917 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"cinder-db-create-t8rmw\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.155075 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.174393 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.215249 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.218605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.232775 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.233100 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235108 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " 
pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235138 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235165 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235217 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.235238 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: 
\"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.236173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.237628 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.248982 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.249859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.270173 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.275464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"barbican-db-create-kv8gv\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.287039 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.332253 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.339691 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: 
\"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.341994 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.344906 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.348098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.363807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.364745 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.365873 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"barbican-71d5-account-create-update-mjs9k\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.366870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.382459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"keystone-db-sync-jsld8\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.393481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.435135 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.441361 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.441502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543081 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: 
\"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.543509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.544393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.578534 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"neutron-db-create-zxh6d\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.618297 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.645676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.645714 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.646592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.658997 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.669670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"neutron-e676-account-create-update-5ntgh\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.691243 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.728267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:27:45 crc kubenswrapper[4792]: W0301 09:27:45.849885 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0b42afb_2954_442e_bc91_4c8275a4d2fd.slice/crio-6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c WatchSource:0}: Error finding container 6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c: Status 404 returned error can't find the container with id 6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c Mar 01 09:27:45 crc kubenswrapper[4792]: I0301 09:27:45.943132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.151598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.292286 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.333190 4792 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.421493 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:27:46 crc kubenswrapper[4792]: I0301 09:27:46.443558 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.473123 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.508600 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.509133 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" containerID="cri-o://6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" gracePeriod=10 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.808128 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerStarted","Data":"1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.808188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerStarted","Data":"212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830034 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-71d5-account-create-update-mjs9k" podStartSLOduration=1.830018323 
podStartE2EDuration="1.830018323s" podCreationTimestamp="2026-03-01 09:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:46.823833891 +0000 UTC m=+1196.065713088" watchObservedRunningTime="2026-03-01 09:27:46.830018323 +0000 UTC m=+1196.071897520" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerStarted","Data":"d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.830451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerStarted","Data":"1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839698 4792 generic.go:334] "Generic (PLEG): container finished" podID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerID="d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerDied","Data":"d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.839858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerStarted","Data":"6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.840975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerStarted","Data":"0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.841878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerStarted","Data":"56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.842995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerStarted","Data":"eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.844989 4792 generic.go:334] "Generic (PLEG): container finished" podID="b3682585-f554-4a65-86cb-096243ccc793" containerID="6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.845087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.846486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerStarted","Data":"babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:46.859364 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8714-account-create-update-wzssg" podStartSLOduration=2.859347773 podStartE2EDuration="2.859347773s" podCreationTimestamp="2026-03-01 
09:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:46.854242378 +0000 UTC m=+1196.096121585" watchObservedRunningTime="2026-03-01 09:27:46.859347773 +0000 UTC m=+1196.101226970" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.126636 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298493 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.298893 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") pod \"b3682585-f554-4a65-86cb-096243ccc793\" (UID: \"b3682585-f554-4a65-86cb-096243ccc793\") " Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.320994 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768" (OuterVolumeSpecName: "kube-api-access-8m768") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "kube-api-access-8m768". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.363264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.377191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.377264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config" (OuterVolumeSpecName: "config") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.400991 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401015 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401026 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m768\" (UniqueName: \"kubernetes.io/projected/b3682585-f554-4a65-86cb-096243ccc793-kube-api-access-8m768\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.401034 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.409688 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3682585-f554-4a65-86cb-096243ccc793" (UID: "b3682585-f554-4a65-86cb-096243ccc793"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:47 crc kubenswrapper[4792]: E0301 09:27:47.477305 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc2406b_db33_4a33_86f1_dd69b0f537a1.slice/crio-3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d.scope\": RecentStats: unable to find data in memory cache]" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.503086 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3682585-f554-4a65-86cb-096243ccc793-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.862460 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd689802-7b27-463e-a155-ed837e8594e6" containerID="a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.862529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerDied","Data":"a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" event={"ID":"b3682585-f554-4a65-86cb-096243ccc793","Type":"ContainerDied","Data":"b2e8c06688804bde6de3674de22e963e00a7db65a6fb4924d8a98f95171a76cc"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865213 4792 scope.go:117] "RemoveContainer" containerID="6049c60340fa24dee1fdddb897ca32ccd559a6ca6bdacf1ad0f33b624dbf4865" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.865323 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-mz86z" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.867864 4792 generic.go:334] "Generic (PLEG): container finished" podID="46b17f7c-595d-4b78-9076-037fb2998f60" containerID="2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.867922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerDied","Data":"2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.871176 4792 generic.go:334] "Generic (PLEG): container finished" podID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerID="1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.871236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerDied","Data":"1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.872988 4792 generic.go:334] "Generic (PLEG): container finished" podID="b715bb3f-b181-4614-85c5-9155286ce80c" containerID="d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.873035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerDied","Data":"d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.878414 4792 generic.go:334] "Generic (PLEG): container finished" podID="efc2406b-db33-4a33-86f1-dd69b0f537a1" 
containerID="3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d" exitCode=0 Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.878599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerDied","Data":"3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d"} Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.921443 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.928127 4792 scope.go:117] "RemoveContainer" containerID="0ca19db3fcc227c24e95850e002b21aaf1788482cc19ab24284a2d399a8eb0fd" Mar 01 09:27:47 crc kubenswrapper[4792]: I0301 09:27:47.931765 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-mz86z"] Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.148809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") pod \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326338 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") pod \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\" (UID: \"f0b42afb-2954-442e-bc91-4c8275a4d2fd\") " Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.326997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0b42afb-2954-442e-bc91-4c8275a4d2fd" (UID: "f0b42afb-2954-442e-bc91-4c8275a4d2fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.329582 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2" (OuterVolumeSpecName: "kube-api-access-nkfk2") pod "f0b42afb-2954-442e-bc91-4c8275a4d2fd" (UID: "f0b42afb-2954-442e-bc91-4c8275a4d2fd"). InnerVolumeSpecName "kube-api-access-nkfk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.427998 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b42afb-2954-442e-bc91-4c8275a4d2fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.428034 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfk2\" (UniqueName: \"kubernetes.io/projected/f0b42afb-2954-442e-bc91-4c8275a4d2fd-kube-api-access-nkfk2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887297 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t8rmw" event={"ID":"f0b42afb-2954-442e-bc91-4c8275a4d2fd","Type":"ContainerDied","Data":"6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c"} Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887350 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ef24e8a781a92526aa634263b2543452521c2477d0c97fecbfe90d267730b1c" Mar 01 09:27:48 crc kubenswrapper[4792]: I0301 09:27:48.887316 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-t8rmw" Mar 01 09:27:49 crc kubenswrapper[4792]: I0301 09:27:49.418441 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3682585-f554-4a65-86cb-096243ccc793" path="/var/lib/kubelet/pods/b3682585-f554-4a65-86cb-096243ccc793/volumes" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.909787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e676-account-create-update-5ntgh" event={"ID":"46b17f7c-595d-4b78-9076-037fb2998f60","Type":"ContainerDied","Data":"babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.910142 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="babfcbb82a67d5d4aee470254138e36f3a5f2fe5e63c17001d84f8159a1935e7" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.912059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-71d5-account-create-update-mjs9k" event={"ID":"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650","Type":"ContainerDied","Data":"212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.912087 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.913767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8714-account-create-update-wzssg" event={"ID":"b715bb3f-b181-4614-85c5-9155286ce80c","Type":"ContainerDied","Data":"1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.913795 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea35c12547bb348b7526cdd5cb7acad399b530142f24896499aaad66db0ae8a" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 
09:27:50.915982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zxh6d" event={"ID":"efc2406b-db33-4a33-86f1-dd69b0f537a1","Type":"ContainerDied","Data":"0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.916038 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f765eadffce7430f601cba60023ade252609311b4c4752eb936b31a4dd4037c" Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.918625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kv8gv" event={"ID":"bd689802-7b27-463e-a155-ed837e8594e6","Type":"ContainerDied","Data":"eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0"} Mar 01 09:27:50 crc kubenswrapper[4792]: I0301 09:27:50.918652 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed00bc467207cc8dc0e2dcbaae3a8c2c1b42a1295709035f04dc74d0943f1f0" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.029417 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.034270 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.053598 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") pod \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") pod \"efc2406b-db33-4a33-86f1-dd69b0f537a1\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") pod \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\" (UID: \"dabb3d2e-57fa-4ad3-9f3b-b85e0b670650\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074561 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") pod \"efc2406b-db33-4a33-86f1-dd69b0f537a1\" (UID: \"efc2406b-db33-4a33-86f1-dd69b0f537a1\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") pod \"46b17f7c-595d-4b78-9076-037fb2998f60\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.074623 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") pod \"46b17f7c-595d-4b78-9076-037fb2998f60\" (UID: \"46b17f7c-595d-4b78-9076-037fb2998f60\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.076296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46b17f7c-595d-4b78-9076-037fb2998f60" (UID: "46b17f7c-595d-4b78-9076-037fb2998f60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.076963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efc2406b-db33-4a33-86f1-dd69b0f537a1" (UID: "efc2406b-db33-4a33-86f1-dd69b0f537a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.077134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" (UID: "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.090445 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc" (OuterVolumeSpecName: "kube-api-access-549vc") pod "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" (UID: "dabb3d2e-57fa-4ad3-9f3b-b85e0b670650"). 
InnerVolumeSpecName "kube-api-access-549vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.092167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8" (OuterVolumeSpecName: "kube-api-access-tn2p8") pod "46b17f7c-595d-4b78-9076-037fb2998f60" (UID: "46b17f7c-595d-4b78-9076-037fb2998f60"). InnerVolumeSpecName "kube-api-access-tn2p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.094689 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7" (OuterVolumeSpecName: "kube-api-access-m5nz7") pod "efc2406b-db33-4a33-86f1-dd69b0f537a1" (UID: "efc2406b-db33-4a33-86f1-dd69b0f537a1"). InnerVolumeSpecName "kube-api-access-m5nz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.136235 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.139900 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181265 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181339 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc2406b-db33-4a33-86f1-dd69b0f537a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181378 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549vc\" (UniqueName: \"kubernetes.io/projected/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650-kube-api-access-549vc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181398 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5nz7\" (UniqueName: \"kubernetes.io/projected/efc2406b-db33-4a33-86f1-dd69b0f537a1-kube-api-access-m5nz7\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181424 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn2p8\" (UniqueName: \"kubernetes.io/projected/46b17f7c-595d-4b78-9076-037fb2998f60-kube-api-access-tn2p8\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.181433 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b17f7c-595d-4b78-9076-037fb2998f60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") pod 
\"bd689802-7b27-463e-a155-ed837e8594e6\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282384 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") pod \"b715bb3f-b181-4614-85c5-9155286ce80c\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282473 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") pod \"bd689802-7b27-463e-a155-ed837e8594e6\" (UID: \"bd689802-7b27-463e-a155-ed837e8594e6\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") pod \"b715bb3f-b181-4614-85c5-9155286ce80c\" (UID: \"b715bb3f-b181-4614-85c5-9155286ce80c\") " Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.282756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd689802-7b27-463e-a155-ed837e8594e6" (UID: "bd689802-7b27-463e-a155-ed837e8594e6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.283243 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd689802-7b27-463e-a155-ed837e8594e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.283372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b715bb3f-b181-4614-85c5-9155286ce80c" (UID: "b715bb3f-b181-4614-85c5-9155286ce80c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.285698 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh" (OuterVolumeSpecName: "kube-api-access-djxnh") pod "b715bb3f-b181-4614-85c5-9155286ce80c" (UID: "b715bb3f-b181-4614-85c5-9155286ce80c"). InnerVolumeSpecName "kube-api-access-djxnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.287883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2" (OuterVolumeSpecName: "kube-api-access-lgch2") pod "bd689802-7b27-463e-a155-ed837e8594e6" (UID: "bd689802-7b27-463e-a155-ed837e8594e6"). InnerVolumeSpecName "kube-api-access-lgch2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384849 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b715bb3f-b181-4614-85c5-9155286ce80c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384896 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgch2\" (UniqueName: \"kubernetes.io/projected/bd689802-7b27-463e-a155-ed837e8594e6-kube-api-access-lgch2\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.384931 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxnh\" (UniqueName: \"kubernetes.io/projected/b715bb3f-b181-4614-85c5-9155286ce80c-kube-api-access-djxnh\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerStarted","Data":"ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0"} Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927223 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kv8gv" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927266 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zxh6d" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927214 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-71d5-account-create-update-mjs9k" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927269 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8714-account-create-update-wzssg" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.927372 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e676-account-create-update-5ntgh" Mar 01 09:27:51 crc kubenswrapper[4792]: I0301 09:27:51.946400 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jsld8" podStartSLOduration=2.37540398 podStartE2EDuration="6.946382109s" podCreationTimestamp="2026-03-01 09:27:45 +0000 UTC" firstStartedPulling="2026-03-01 09:27:46.331957297 +0000 UTC m=+1195.573836484" lastFinishedPulling="2026-03-01 09:27:50.902935416 +0000 UTC m=+1200.144814613" observedRunningTime="2026-03-01 09:27:51.940860243 +0000 UTC m=+1201.182739440" watchObservedRunningTime="2026-03-01 09:27:51.946382109 +0000 UTC m=+1201.188261316" Mar 01 09:27:54 crc kubenswrapper[4792]: I0301 09:27:54.949754 4792 generic.go:334] "Generic (PLEG): container finished" podID="465282ce-1312-4cb6-ae89-de6ada48a901" containerID="ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0" exitCode=0 Mar 01 09:27:54 crc kubenswrapper[4792]: I0301 09:27:54.949994 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerDied","Data":"ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0"} Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.243809 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.276883 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") pod \"465282ce-1312-4cb6-ae89-de6ada48a901\" (UID: \"465282ce-1312-4cb6-ae89-de6ada48a901\") " Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.298654 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn" (OuterVolumeSpecName: "kube-api-access-scqkn") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "kube-api-access-scqkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.305167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.334425 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data" (OuterVolumeSpecName: "config-data") pod "465282ce-1312-4cb6-ae89-de6ada48a901" (UID: "465282ce-1312-4cb6-ae89-de6ada48a901"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378716 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378837 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465282ce-1312-4cb6-ae89-de6ada48a901-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.378892 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scqkn\" (UniqueName: \"kubernetes.io/projected/465282ce-1312-4cb6-ae89-de6ada48a901-kube-api-access-scqkn\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jsld8" event={"ID":"465282ce-1312-4cb6-ae89-de6ada48a901","Type":"ContainerDied","Data":"56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c"} Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968353 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56894213f0a89a43fb441887180443bcea8f64c8866a065497a7cd889c1c397c" Mar 01 09:27:56 crc kubenswrapper[4792]: I0301 09:27:56.968400 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jsld8" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.215738 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216427 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216436 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216443 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216456 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216463 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216474 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216479 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216488 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="init" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216494 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="init" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216504 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216512 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216521 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216527 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216539 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216545 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: E0301 09:27:57.216551 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216556 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216692 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3682585-f554-4a65-86cb-096243ccc793" containerName="dnsmasq-dns" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216709 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216720 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216729 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216738 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd689802-7b27-463e-a155-ed837e8594e6" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216748 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" containerName="mariadb-account-create-update" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216762 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" containerName="mariadb-database-create" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.216771 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" containerName="keystone-db-sync" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.217534 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.237195 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.238249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.243979 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.243982 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244039 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244108 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.244793 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.255802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.265585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295034 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295115 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295170 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295258 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.295295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.396156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396216 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396343 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396383 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.396441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.397967 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.398644 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.403805 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: 
\"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.403887 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.404031 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.404464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.409569 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.450503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"dnsmasq-dns-7796644bc-dm4ww\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.465986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"keystone-bootstrap-2zjg5\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.534293 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.551311 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.560841 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.567013 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.567502 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.590124 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.601959 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod 
\"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.602072 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.604545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.605857 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621666 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7rpd7" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.621960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.649623 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.701952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.702960 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704640 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: 
\"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704839 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.704855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.712053 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712237 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k82sx" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712335 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.712704 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.715152 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.717638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.726406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.729151 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.737768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.740659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"ceilometer-0\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " pod="openstack/ceilometer-0" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.743246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.787979 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 
09:27:57.806606 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806770 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.806814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.807217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.812657 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.814147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.818484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.818840 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.845405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"cinder-db-sync-gsxqb\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.854128 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.862872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.864093 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.867342 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjs57" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.868285 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.879208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.891149 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.892355 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.898097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z8zjf" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.900608 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.900801 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.906482 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc 
kubenswrapper[4792]: I0301 09:27:57.908597 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908666 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.908779 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.913693 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.946638 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.955209 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.960837 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.963068 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:57 crc kubenswrapper[4792]: I0301 09:27:57.963517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"neutron-db-sync-gbmwh\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.000179 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013064 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013190 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013349 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013460 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.013504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.017354 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.039528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.039986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.040207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"barbican-db-sync-bxx5d\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.051390 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.114960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod 
\"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115199 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " 
pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.115293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.116316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.116841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.117730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.119113 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.119796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.121926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: E0301 09:27:58.129405 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.135596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"dnsmasq-dns-589bbb667-6gxlc\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.146507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"placement-db-sync-f89zl\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.293266 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.330548 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.342225 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.388437 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.557423 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.720202 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.731670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.874868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:27:58 crc kubenswrapper[4792]: W0301 09:27:58.898564 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66aba873_81b0_452a_81f9_73cc18445180.slice/crio-0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3 WatchSource:0}: Error finding container 0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3: Status 404 returned error can't find the container with id 0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3 Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.997717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerStarted","Data":"670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5"} Mar 01 09:27:58 crc kubenswrapper[4792]: I0301 09:27:58.997758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" 
event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerStarted","Data":"1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004535 4792 generic.go:334] "Generic (PLEG): container finished" podID="30b802af-3af5-430f-b06f-709fd4606fd0" containerID="da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d" exitCode=0 Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004620 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerDied","Data":"da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.004643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerStarted","Data":"16b663874cdd99123d93e52d5d4fca0608f91f60cad52b8d32e1ae059e5512ec"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.008489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerStarted","Data":"0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.010010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerStarted","Data":"dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.017804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2"} Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 
09:27:59.063193 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2zjg5" podStartSLOduration=2.063170056 podStartE2EDuration="2.063170056s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:27:59.015147197 +0000 UTC m=+1208.257026394" watchObservedRunningTime="2026-03-01 09:27:59.063170056 +0000 UTC m=+1208.305049253" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.103949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.176101 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.218662 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:27:59 crc kubenswrapper[4792]: W0301 09:27:59.225844 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e1a508b_9db4_414a_b06d_2f01a2c132a1.slice/crio-0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b WatchSource:0}: Error finding container 0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b: Status 404 returned error can't find the container with id 0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.387839 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459796 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.459864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") pod \"30b802af-3af5-430f-b06f-709fd4606fd0\" (UID: \"30b802af-3af5-430f-b06f-709fd4606fd0\") " Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.504121 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc" (OuterVolumeSpecName: "kube-api-access-4mwlc") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "kube-api-access-4mwlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.514748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.518553 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config" (OuterVolumeSpecName: "config") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.524480 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.527312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30b802af-3af5-430f-b06f-709fd4606fd0" (UID: "30b802af-3af5-430f-b06f-709fd4606fd0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561480 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561502 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561513 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mwlc\" (UniqueName: \"kubernetes.io/projected/30b802af-3af5-430f-b06f-709fd4606fd0-kube-api-access-4mwlc\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561521 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.561531 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30b802af-3af5-430f-b06f-709fd4606fd0-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:27:59 crc kubenswrapper[4792]: I0301 09:27:59.777211 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.056054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerStarted","Data":"acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058715 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerID="70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168" exitCode=0 Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.058776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerStarted","Data":"0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111219 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" event={"ID":"30b802af-3af5-430f-b06f-709fd4606fd0","Type":"ContainerDied","Data":"16b663874cdd99123d93e52d5d4fca0608f91f60cad52b8d32e1ae059e5512ec"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111267 4792 scope.go:117] "RemoveContainer" containerID="da4e59955cfda8f625c142c442b34e7bee84a99ca2a745a710222ff222c9839d" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.111398 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-dm4ww" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.139399 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gbmwh" podStartSLOduration=3.139382822 podStartE2EDuration="3.139382822s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:00.088556014 +0000 UTC m=+1209.330435211" watchObservedRunningTime="2026-03-01 09:28:00.139382822 +0000 UTC m=+1209.381262009" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.197986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerStarted","Data":"e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.215325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerStarted","Data":"5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850"} Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234248 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:00 crc kubenswrapper[4792]: E0301 09:28:00.234584 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234596 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.234752 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" containerName="init" Mar 01 
09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.235248 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.246505 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.246702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.276474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.314086 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.344204 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.358989 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-dm4ww"] Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.409157 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.525072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: 
\"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.542171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"auto-csr-approver-29539288-9klf4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:00 crc kubenswrapper[4792]: I0301 09:28:00.587155 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.221509 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.249005 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerStarted","Data":"245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c"} Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.249124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:01 crc kubenswrapper[4792]: W0301 09:28:01.295637 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8f417d_a9b7_4969_9e24_785fa8baf9c4.slice/crio-4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0 WatchSource:0}: Error finding container 4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0: Status 404 returned error can't find the container with id 4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0 Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.452218 4792 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="30b802af-3af5-430f-b06f-709fd4606fd0" path="/var/lib/kubelet/pods/30b802af-3af5-430f-b06f-709fd4606fd0/volumes" Mar 01 09:28:01 crc kubenswrapper[4792]: I0301 09:28:01.472039 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podStartSLOduration=4.472013663 podStartE2EDuration="4.472013663s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:01.281875796 +0000 UTC m=+1210.523754993" watchObservedRunningTime="2026-03-01 09:28:01.472013663 +0000 UTC m=+1210.713892860" Mar 01 09:28:02 crc kubenswrapper[4792]: I0301 09:28:02.276307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerStarted","Data":"4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0"} Mar 01 09:28:04 crc kubenswrapper[4792]: I0301 09:28:04.314604 4792 generic.go:334] "Generic (PLEG): container finished" podID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerID="670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5" exitCode=0 Mar 01 09:28:04 crc kubenswrapper[4792]: I0301 09:28:04.314649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerDied","Data":"670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5"} Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.916353 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.920712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921762 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921804 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.921985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") pod \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\" (UID: \"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d\") " Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.926662 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts" (OuterVolumeSpecName: "scripts") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.928088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8" (OuterVolumeSpecName: "kube-api-access-jxwh8") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "kube-api-access-jxwh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.929565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.954199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.964640 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:05 crc kubenswrapper[4792]: I0301 09:28:05.973368 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data" (OuterVolumeSpecName: "config-data") pod "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" (UID: "3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023720 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023770 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023780 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023793 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 
09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023806 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.023817 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwh8\" (UniqueName: \"kubernetes.io/projected/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d-kube-api-access-jxwh8\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2zjg5" event={"ID":"3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d","Type":"ContainerDied","Data":"1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d"} Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336042 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f893a9e0566eb476f19571ee15d1e6b6197f05ea1b028ec4b64824306365a2d" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.336079 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2zjg5" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.423015 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.430996 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2zjg5"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.513577 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:06 crc kubenswrapper[4792]: E0301 09:28:06.513954 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.513973 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.514130 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" containerName="keystone-bootstrap" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.514620 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.516929 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.518952 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.522275 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.533164 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633479 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633767 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633830 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.633942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735868 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.735964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.736011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.736031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.741852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"keystone-bootstrap-m9l5f\" (UID: 
\"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.742413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.743758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.743861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.745105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 09:28:06.751853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"keystone-bootstrap-m9l5f\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") " pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:06 crc kubenswrapper[4792]: I0301 
09:28:06.893336 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f" Mar 01 09:28:07 crc kubenswrapper[4792]: I0301 09:28:07.419776 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d" path="/var/lib/kubelet/pods/3b42d3d2-f423-4f22-9bc7-7ba4ede5b61d/volumes" Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.343746 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:08 crc kubenswrapper[4792]: E0301 09:28:08.353491 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.411325 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:08 crc kubenswrapper[4792]: I0301 09:28:08.411771 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" containerID="cri-o://b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" gracePeriod=10 Mar 01 09:28:09 crc kubenswrapper[4792]: I0301 09:28:09.364894 4792 generic.go:334] "Generic (PLEG): container finished" podID="667fff68-7113-4dfe-86b4-34b80b41d326" containerID="b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" exitCode=0 Mar 01 09:28:09 crc kubenswrapper[4792]: I0301 09:28:09.364947 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5"} Mar 01 09:28:11 crc kubenswrapper[4792]: I0301 09:28:11.418077 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 01 09:28:16 crc kubenswrapper[4792]: I0301 09:28:16.417262 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 01 09:28:18 crc kubenswrapper[4792]: E0301 09:28:18.550597 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.148974 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.149609 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzxv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bxx5d_openstack(9e6bad7a-881b-4ef4-9916-f447e2fc1ffd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.150871 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bxx5d" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" Mar 01 09:28:22 crc kubenswrapper[4792]: E0301 09:28:22.483409 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-bxx5d" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.092364 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.092739 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84xmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gsxqb_openstack(737aa0a0-6e53-451e-9d5f-2deada87b5b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.093997 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gsxqb" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.341623 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.454778 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.454855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 
09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455314 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.455458 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") pod \"667fff68-7113-4dfe-86b4-34b80b41d326\" (UID: \"667fff68-7113-4dfe-86b4-34b80b41d326\") " Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.464588 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.469262 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7" (OuterVolumeSpecName: "kube-api-access-77bc7") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "kube-api-access-77bc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.494046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerStarted","Data":"fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052"} Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" event={"ID":"667fff68-7113-4dfe-86b4-34b80b41d326","Type":"ContainerDied","Data":"386c10f9f902b49dfdcc7e63fa9588772c851427627a00112a847f427ef0dd79"} Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496809 4792 scope.go:117] "RemoveContainer" containerID="b035d55e8df5d946580d2dfd8c4b3f7b4f07c6856584d562fa0676ed055fd3e5" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.496964 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.501059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601"} Mar 01 09:28:23 crc kubenswrapper[4792]: E0301 09:28:23.502064 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-gsxqb" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.513392 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-f89zl" podStartSLOduration=2.60839733 
podStartE2EDuration="26.513342096s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:59.179601974 +0000 UTC m=+1208.421481171" lastFinishedPulling="2026-03-01 09:28:23.08454674 +0000 UTC m=+1232.326425937" observedRunningTime="2026-03-01 09:28:23.509556833 +0000 UTC m=+1232.751436030" watchObservedRunningTime="2026-03-01 09:28:23.513342096 +0000 UTC m=+1232.755221293" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.520464 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.524247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config" (OuterVolumeSpecName: "config") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.544703 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.548704 4792 scope.go:117] "RemoveContainer" containerID="e23b7bf7394271fedc5e4abedff3f86bcbcc1a5fe82a8164474dcf3fb06696d9" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.555718 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "667fff68-7113-4dfe-86b4-34b80b41d326" (UID: "667fff68-7113-4dfe-86b4-34b80b41d326"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.556490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.559748 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.560985 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77bc7\" (UniqueName: \"kubernetes.io/projected/667fff68-7113-4dfe-86b4-34b80b41d326-kube-api-access-77bc7\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561024 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561037 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.561090 4792 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/667fff68-7113-4dfe-86b4-34b80b41d326-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.829322 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:23 crc kubenswrapper[4792]: I0301 09:28:23.858591 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-p8x5s"] Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.516947 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerID="0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5" exitCode=0 Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.517024 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerDied","Data":"0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.519966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerStarted","Data":"4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.520010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerStarted","Data":"2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"} Mar 01 09:28:24 crc kubenswrapper[4792]: I0301 09:28:24.551954 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m9l5f" podStartSLOduration=18.551938049 podStartE2EDuration="18.551938049s" podCreationTimestamp="2026-03-01 09:28:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:24.550015492 +0000 UTC m=+1233.791894689" watchObservedRunningTime="2026-03-01 09:28:24.551938049 +0000 UTC m=+1233.793817236" Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.422100 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" path="/var/lib/kubelet/pods/667fff68-7113-4dfe-86b4-34b80b41d326/volumes" Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.527985 4792 generic.go:334] "Generic (PLEG): container finished" podID="66aba873-81b0-452a-81f9-73cc18445180" containerID="acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a" exitCode=0 Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.528059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerDied","Data":"acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a"} Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.529690 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5"} Mar 01 09:28:25 crc kubenswrapper[4792]: I0301 09:28:25.905329 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.103669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") pod \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\" (UID: \"2e8f417d-a9b7-4969-9e24-785fa8baf9c4\") " Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.114847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx" (OuterVolumeSpecName: "kube-api-access-67qpx") pod "2e8f417d-a9b7-4969-9e24-785fa8baf9c4" (UID: "2e8f417d-a9b7-4969-9e24-785fa8baf9c4"). InnerVolumeSpecName "kube-api-access-67qpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.205392 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67qpx\" (UniqueName: \"kubernetes.io/projected/2e8f417d-a9b7-4969-9e24-785fa8baf9c4-kube-api-access-67qpx\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.417659 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-p8x5s" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.538433 4792 generic.go:334] "Generic (PLEG): container finished" podID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerID="fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052" exitCode=0 Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.538495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" 
event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerDied","Data":"fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052"} Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541761 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539288-9klf4" event={"ID":"2e8f417d-a9b7-4969-9e24-785fa8baf9c4","Type":"ContainerDied","Data":"4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0"} Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541805 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539288-9klf4" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.541813 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8eb3139f21d41482ab553230f14aad91a6b71c593740f60f59167f5d6203d0" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.953067 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.966890 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:28:26 crc kubenswrapper[4792]: I0301 09:28:26.975051 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539282-dcfkb"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119155 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.119343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") pod \"66aba873-81b0-452a-81f9-73cc18445180\" (UID: \"66aba873-81b0-452a-81f9-73cc18445180\") " Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.142237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98" (OuterVolumeSpecName: "kube-api-access-95d98") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "kube-api-access-95d98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.149625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.152128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config" (OuterVolumeSpecName: "config") pod "66aba873-81b0-452a-81f9-73cc18445180" (UID: "66aba873-81b0-452a-81f9-73cc18445180"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221545 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221749 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66aba873-81b0-452a-81f9-73cc18445180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.221831 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95d98\" (UniqueName: \"kubernetes.io/projected/66aba873-81b0-452a-81f9-73cc18445180-kube-api-access-95d98\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.419967 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcb7c96-6ab5-413c-b776-d1bc938e85c0" path="/var/lib/kubelet/pods/1fcb7c96-6ab5-413c-b776-d1bc938e85c0/volumes" Mar 01 09:28:27 crc 
kubenswrapper[4792]: I0301 09:28:27.554402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbmwh" event={"ID":"66aba873-81b0-452a-81f9-73cc18445180","Type":"ContainerDied","Data":"0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3"} Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.554445 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbmwh" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.554471 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7cc39250ba374cd76a9cab23087e1f60274ea1219843b9f8303a09038d9fc3" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.849946 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.850926 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.850944 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.850971 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="init" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.850978 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="init" Mar 01 09:28:27 crc kubenswrapper[4792]: E0301 09:28:27.853462 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853482 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 
crc kubenswrapper[4792]: E0301 09:28:27.853498 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853506 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853959 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="667fff68-7113-4dfe-86b4-34b80b41d326" containerName="dnsmasq-dns" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853978 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66aba873-81b0-452a-81f9-73cc18445180" containerName="neutron-db-sync" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.853992 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" containerName="oc" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.857307 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.911035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.956785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.964093 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.966447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.972618 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973003 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.973439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k82sx" Mar 01 09:28:27 crc kubenswrapper[4792]: I0301 09:28:27.981491 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4gm\" (UniqueName: 
\"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.059722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.060787 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.062795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod 
\"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.062986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.063182 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.109678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"dnsmasq-dns-76fc8568c7-drpr8\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" 
(UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.160964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.161004 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.211897 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.261955 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.262236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc 
kubenswrapper[4792]: I0301 09:28:28.267025 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.267451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.268217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.269953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.288457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"neutron-7bbc5b86d6-8b672\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.296682 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.570729 4792 generic.go:334] "Generic (PLEG): container finished" podID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerID="4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066" exitCode=0 Mar 01 09:28:28 crc kubenswrapper[4792]: I0301 09:28:28.570767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerDied","Data":"4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066"} Mar 01 09:28:28 crc kubenswrapper[4792]: E0301 09:28:28.763062 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.281449 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-f89zl" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389755 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.389816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs" (OuterVolumeSpecName: "logs") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.390711 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") pod \"e623b24a-64a5-4209-86bb-1814ae9c400b\" (UID: \"e623b24a-64a5-4209-86bb-1814ae9c400b\") " Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.391174 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e623b24a-64a5-4209-86bb-1814ae9c400b-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.401019 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts" (OuterVolumeSpecName: "scripts") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.401086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b" (OuterVolumeSpecName: "kube-api-access-4f79b") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "kube-api-access-4f79b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.430476 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data" (OuterVolumeSpecName: "config-data") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.430500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e623b24a-64a5-4209-86bb-1814ae9c400b" (UID: "e623b24a-64a5-4209-86bb-1814ae9c400b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492599 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f79b\" (UniqueName: \"kubernetes.io/projected/e623b24a-64a5-4209-86bb-1814ae9c400b-kube-api-access-4f79b\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492632 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492643 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.492651 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e623b24a-64a5-4209-86bb-1814ae9c400b-config-data\") on node \"crc\" DevicePath \"\"" Mar 
01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.580929 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-f89zl"
Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.580950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-f89zl" event={"ID":"e623b24a-64a5-4209-86bb-1814ae9c400b","Type":"ContainerDied","Data":"e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488"}
Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.581031 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4885fc9359de722bc25d23b6b1337620c1bc715e4d3353e33a0de4919152488"
Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.995672 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:28:29 crc kubenswrapper[4792]: E0301 09:28:29.996663 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync"
Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.996736 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync"
Mar 01 09:28:29 crc kubenswrapper[4792]: I0301 09:28:29.996954 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" containerName="placement-db-sync"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:29.997823 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.011226 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.011674 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.012065 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.104983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105035 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.105145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207305 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.207434 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.213016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.213575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.216229 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.226616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.227434 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.237573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.239053 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"neutron-66955dfdb5-6j2wx\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.321819 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.386810 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.388099 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394229 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394250 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394803 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.394967 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-z8zjf"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460308 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460348 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460401 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.460525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.482587 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563845 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.563967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.564495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565018 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565102 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.565190 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.568368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.568955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.569250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.577572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.577980 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.600469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"placement-7f7447dcd6-cpnn5\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:30 crc kubenswrapper[4792]: I0301 09:28:30.763971 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.036533 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196634 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196687 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196752 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.196918 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") pod \"7108e9ac-8215-41ca-ac84-3b3851142a42\" (UID: \"7108e9ac-8215-41ca-ac84-3b3851142a42\") "
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.205756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts" (OuterVolumeSpecName: "scripts") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.214900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.216408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.219051 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl" (OuterVolumeSpecName: "kube-api-access-l92rl") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "kube-api-access-l92rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.272971 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data" (OuterVolumeSpecName: "config-data") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.288639 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7108e9ac-8215-41ca-ac84-3b3851142a42" (UID: "7108e9ac-8215-41ca-ac84-3b3851142a42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300339 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300367 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92rl\" (UniqueName: \"kubernetes.io/projected/7108e9ac-8215-41ca-ac84-3b3851142a42-kube-api-access-l92rl\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300379 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300388 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300397 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.300406 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7108e9ac-8215-41ca-ac84-3b3851142a42-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m9l5f" event={"ID":"7108e9ac-8215-41ca-ac84-3b3851142a42","Type":"ContainerDied","Data":"2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"}
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605166 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1de65ec699fbf1baf52d26468fee288183e04f12523b93783a52b6e1c65a17"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.605224 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m9l5f"
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.608251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f"}
Mar 01 09:28:32 crc kubenswrapper[4792]: W0301 09:28:32.702156 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod947b32da_5664_42ff_a665_ac182dea1433.slice/crio-5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179 WatchSource:0}: Error finding container 5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179: Status 404 returned error can't find the container with id 5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.707205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"]
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.908474 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"]
Mar 01 09:28:32 crc kubenswrapper[4792]: I0301 09:28:32.940101 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.035252 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.184450 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-749f685d77-ggsln"]
Mar 01 09:28:33 crc kubenswrapper[4792]: E0301 09:28:33.186046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.186331 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.186581 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" containerName="keystone-bootstrap"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.188163 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191105 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fr9vh"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.191722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192416 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192711 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.192782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.211604 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749f685d77-ggsln"]
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.235983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.236038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.236124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.347441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.348082 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.351760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.351809 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.355232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-fernet-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.355587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-combined-ca-bundle\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.357675 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-config-data\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.358693 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.362356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-internal-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.367255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-credential-keys\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.368076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-scripts\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.368812 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b60e7776-3e2a-4e08-900d-cd39a29a78bc-public-tls-certs\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.380006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxsj\" (UniqueName: \"kubernetes.io/projected/b60e7776-3e2a-4e08-900d-cd39a29a78bc-kube-api-access-rkxsj\") pod \"keystone-749f685d77-ggsln\" (UID: \"b60e7776-3e2a-4e08-900d-cd39a29a78bc\") " pod="openstack/keystone-749f685d77-ggsln"
Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.577583 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-749f685d77-ggsln" Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.624951 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.624996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"0e01f5820150bb847cd98736209f40a8cfae4c1d42fc832b6f738e299bc2db88"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerStarted","Data":"5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631843 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.631963 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:28:33 crc 
kubenswrapper[4792]: I0301 09:28:33.635158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.635187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"120df1c67b7935983b4052512da03cb57b70749fae0a4306db0f53d6bed8199c"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.645950 4792 generic.go:334] "Generic (PLEG): container finished" podID="82467164-5e77-4ea0-beee-b3a70126c075" containerID="610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6" exitCode=0 Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.645996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.646019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerStarted","Data":"8b9ae7283e8be3bed9877439415e05f84a3d2c818de0aa317ad4a53c2c6bc4d6"} Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.663586 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f7447dcd6-cpnn5" podStartSLOduration=3.663555251 podStartE2EDuration="3.663555251s" podCreationTimestamp="2026-03-01 09:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:33.647735923 +0000 UTC m=+1242.889615120" 
watchObservedRunningTime="2026-03-01 09:28:33.663555251 +0000 UTC m=+1242.905434448" Mar 01 09:28:33 crc kubenswrapper[4792]: I0301 09:28:33.900132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749f685d77-ggsln"] Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.662379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerStarted","Data":"2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63"} Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.662735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.664345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749f685d77-ggsln" event={"ID":"b60e7776-3e2a-4e08-900d-cd39a29a78bc","Type":"ContainerStarted","Data":"46610eac821e49ac438645f9b9439a857b8afb4d70878718f9ff80be2c356756"} Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.667302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerStarted","Data":"4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70"} Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.667444 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.669153 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerStarted","Data":"5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257"} Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.669696 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.690364 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bbc5b86d6-8b672" podStartSLOduration=7.690349465 podStartE2EDuration="7.690349465s" podCreationTimestamp="2026-03-01 09:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.685722011 +0000 UTC m=+1243.927601208" watchObservedRunningTime="2026-03-01 09:28:34.690349465 +0000 UTC m=+1243.932228662" Mar 01 09:28:34 crc kubenswrapper[4792]: I0301 09:28:34.753689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66955dfdb5-6j2wx" podStartSLOduration=5.753672009 podStartE2EDuration="5.753672009s" podCreationTimestamp="2026-03-01 09:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.723717584 +0000 UTC m=+1243.965596781" watchObservedRunningTime="2026-03-01 09:28:34.753672009 +0000 UTC m=+1243.995551196" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.431767 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podStartSLOduration=8.431728363 podStartE2EDuration="8.431728363s" podCreationTimestamp="2026-03-01 09:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:34.755972385 +0000 UTC m=+1243.997851582" watchObservedRunningTime="2026-03-01 09:28:35.431728363 +0000 UTC m=+1244.673607560" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.680449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749f685d77-ggsln" 
event={"ID":"b60e7776-3e2a-4e08-900d-cd39a29a78bc","Type":"ContainerStarted","Data":"9d05507d7c784c54e92f11b486349b46e1781ea9f40abf0c6c7a47b5a0f5a762"} Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.680617 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-749f685d77-ggsln" Mar 01 09:28:35 crc kubenswrapper[4792]: I0301 09:28:35.708914 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-749f685d77-ggsln" podStartSLOduration=2.708879435 podStartE2EDuration="2.708879435s" podCreationTimestamp="2026-03-01 09:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:35.70013982 +0000 UTC m=+1244.942019017" watchObservedRunningTime="2026-03-01 09:28:35.708879435 +0000 UTC m=+1244.950758632" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.214115 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.277270 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.277514 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" containerID="cri-o://245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" gracePeriod=10 Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.343342 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.722258 4792 
generic.go:334] "Generic (PLEG): container finished" podID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerID="245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" exitCode=0 Mar 01 09:28:38 crc kubenswrapper[4792]: I0301 09:28:38.722300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c"} Mar 01 09:28:38 crc kubenswrapper[4792]: E0301 09:28:38.954697 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.705503 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.814832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" event={"ID":"4e1a508b-9db4-414a-b06d-2f01a2c132a1","Type":"ContainerDied","Data":"0fc29a339e8c09faaadb240388644d964ff0e254b0fbfad00516d578fb8bff1b"} Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.815189 4792 scope.go:117] "RemoveContainer" containerID="245d762d71bcda288abc788f3327cb41b0f2c4d918af85b1ae35b3d1615c472c" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.815627 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-6gxlc" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867793 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867819 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867858 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.867877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") pod \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\" (UID: \"4e1a508b-9db4-414a-b06d-2f01a2c132a1\") " Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.891581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl" (OuterVolumeSpecName: "kube-api-access-cv7bl") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "kube-api-access-cv7bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.902412 4792 scope.go:117] "RemoveContainer" containerID="70a638170c1ac600ae163392b053537afb8b6aa9687b87f677426f4f023db168" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.923819 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config" (OuterVolumeSpecName: "config") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.936086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.938394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.941477 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e1a508b-9db4-414a-b06d-2f01a2c132a1" (UID: "4e1a508b-9db4-414a-b06d-2f01a2c132a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970278 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970400 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970467 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv7bl\" (UniqueName: \"kubernetes.io/projected/4e1a508b-9db4-414a-b06d-2f01a2c132a1-kube-api-access-cv7bl\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970538 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:42 crc kubenswrapper[4792]: I0301 09:28:42.970601 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1a508b-9db4-414a-b06d-2f01a2c132a1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.175619 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 
09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.181968 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-6gxlc"] Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.423976 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" path="/var/lib/kubelet/pods/4e1a508b-9db4-414a-b06d-2f01a2c132a1/volumes" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.823086 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerStarted","Data":"eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.825059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerStarted","Data":"5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerStarted","Data":"131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a"} Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827431 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" containerID="cri-o://4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827688 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827733 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" containerID="cri-o://131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827774 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" containerID="cri-o://10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.827813 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" containerID="cri-o://ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" gracePeriod=30 Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.871959 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bxx5d" podStartSLOduration=3.222251536 podStartE2EDuration="46.871942903s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:59.114605319 +0000 UTC m=+1208.356484516" lastFinishedPulling="2026-03-01 09:28:42.764296686 +0000 UTC m=+1252.006175883" observedRunningTime="2026-03-01 09:28:43.850027875 +0000 UTC m=+1253.091907072" watchObservedRunningTime="2026-03-01 09:28:43.871942903 +0000 UTC m=+1253.113822100" Mar 01 09:28:43 crc kubenswrapper[4792]: I0301 09:28:43.900791 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gsxqb" podStartSLOduration=2.934817501 podStartE2EDuration="46.900776011s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:58.769330444 +0000 UTC m=+1208.011209641" lastFinishedPulling="2026-03-01 09:28:42.735288954 +0000 UTC m=+1251.977168151" 
observedRunningTime="2026-03-01 09:28:43.87304073 +0000 UTC m=+1253.114919927" watchObservedRunningTime="2026-03-01 09:28:43.900776011 +0000 UTC m=+1253.142655208" Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.837834 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a" exitCode=0 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838153 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f" exitCode=2 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838163 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" exitCode=0 Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.837928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a"} Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f"} Mar 01 09:28:44 crc kubenswrapper[4792]: I0301 09:28:44.838205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.857703 4792 generic.go:334] "Generic (PLEG): container finished" podID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" 
containerID="eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9" exitCode=0 Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.857850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerDied","Data":"eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865197 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerID="ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" exitCode=0 Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd","Type":"ContainerDied","Data":"4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2"} Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.865372 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a875a1ec016948e8ea916192a157e8f35d24195db5c29c64d33740934a209c2" Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.884753 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.871369731 podStartE2EDuration="49.884733645s" podCreationTimestamp="2026-03-01 09:27:57 +0000 UTC" firstStartedPulling="2026-03-01 09:27:58.755777331 +0000 UTC m=+1207.997656528" lastFinishedPulling="2026-03-01 09:28:42.769141245 +0000 UTC m=+1252.011020442" observedRunningTime="2026-03-01 09:28:43.904948334 +0000 UTC m=+1253.146827531" 
watchObservedRunningTime="2026-03-01 09:28:46.884733645 +0000 UTC m=+1256.126612842" Mar 01 09:28:46 crc kubenswrapper[4792]: I0301 09:28:46.893675 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.051802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052386 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052517 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.052716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053827 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053934 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.053986 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.054084 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") pod \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\" (UID: \"f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd\") " Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.055253 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.055281 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.060565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn" (OuterVolumeSpecName: "kube-api-access-zgczn") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "kube-api-access-zgczn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.060667 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts" (OuterVolumeSpecName: "scripts") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.086155 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.141823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156852 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156891 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156905 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgczn\" (UniqueName: \"kubernetes.io/projected/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-kube-api-access-zgczn\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.156933 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.163003 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data" (OuterVolumeSpecName: "config-data") pod "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" (UID: "f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.258325 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.871392 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.904384 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.934377 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.989993 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990465 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990485 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990506 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990512 4792 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990528 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990534 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990547 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="init" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990554 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="init" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990564 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990570 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: E0301 09:28:47.990581 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990587 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990753 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-notification-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990767 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1a508b-9db4-414a-b06d-2f01a2c132a1" containerName="dnsmasq-dns" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990775 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="proxy-httpd" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990791 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="ceilometer-central-agent" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.990800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" containerName="sg-core" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.992435 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:47 crc kubenswrapper[4792]: I0301 09:28:47.999934 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.000128 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.001770 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.156650 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:48 crc kubenswrapper[4792]: E0301 09:28:48.157242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-2hgmj log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e3d1f920-cbe0-4883-8030-826eab25677d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173356 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173828 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: 
\"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.173880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc 
kubenswrapper[4792]: I0301 09:28:48.275300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.275987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.291464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"ceilometer-0\" (UID: 
\"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.292087 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.296581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.299540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.303700 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.395280 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.477782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.478175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.478280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") pod \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\" (UID: \"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.481826 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.482734 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7" (OuterVolumeSpecName: "kube-api-access-pzxv7") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). 
InnerVolumeSpecName "kube-api-access-pzxv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.503822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" (UID: "9e6bad7a-881b-4ef4-9916-f447e2fc1ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583556 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxv7\" (UniqueName: \"kubernetes.io/projected/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-kube-api-access-pzxv7\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583620 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.583636 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885339 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885394 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bxx5d" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bxx5d" event={"ID":"9e6bad7a-881b-4ef4-9916-f447e2fc1ffd","Type":"ContainerDied","Data":"5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850"} Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.885569 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8a6506253c42d0a8617675f0f4091a77e441fa914d2388057587c940f25850" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.900436 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.988757 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.989560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990077 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990220 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") pod \"e3d1f920-cbe0-4883-8030-826eab25677d\" (UID: \"e3d1f920-cbe0-4883-8030-826eab25677d\") " Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.990646 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.991172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:48 crc kubenswrapper[4792]: I0301 09:28:48.996127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:48.999733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts" (OuterVolumeSpecName: "scripts") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.000324 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.000649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data" (OuterVolumeSpecName: "config-data") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.004526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj" (OuterVolumeSpecName: "kube-api-access-2hgmj") pod "e3d1f920-cbe0-4883-8030-826eab25677d" (UID: "e3d1f920-cbe0-4883-8030-826eab25677d"). InnerVolumeSpecName "kube-api-access-2hgmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.094906 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hgmj\" (UniqueName: \"kubernetes.io/projected/e3d1f920-cbe0-4883-8030-826eab25677d-kube-api-access-2hgmj\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100630 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100643 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100654 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3d1f920-cbe0-4883-8030-826eab25677d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100666 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.100678 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d1f920-cbe0-4883-8030-826eab25677d-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:49 crc kubenswrapper[4792]: E0301 09:28:49.206116 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice/crio-212a58af65a44bd380179af8061fb1773b3c54204d964227d1d4aa6b00521785\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddabb3d2e_57fa_4ad3_9f3b_b85e0b670650.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.238337 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"] Mar 01 09:28:49 crc kubenswrapper[4792]: E0301 09:28:49.238761 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.238780 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.246403 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" containerName="barbican-db-sync" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.247408 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.252371 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjs57" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.266983 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.267080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.290937 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.293096 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.305800 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.311980 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.342440 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.391860 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.393731 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407459 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407573 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.407732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408247 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod 
\"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.408481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.440245 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd" path="/var/lib/kubelet/pods/f1a0fe66-bdbf-4a52-a3b7-ff3d1898ffdd/volumes" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510352 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " 
pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: 
\"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510568 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod 
\"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.510694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.511464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d30c642c-b4ae-495a-8acd-cc8be4a0f412-logs\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.512526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26fbd30a-a485-4463-9aac-bb695c43e9e3-logs\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.520828 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-combined-ca-bundle\") pod 
\"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.520859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-combined-ca-bundle\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.521930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.523212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data-custom\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.526835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c642c-b4ae-495a-8acd-cc8be4a0f412-config-data\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.546461 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.562232 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.562664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26fbd30a-a485-4463-9aac-bb695c43e9e3-config-data-custom\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.567926 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.613833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615105 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc 
kubenswrapper[4792]: I0301 09:28:49.615187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.615923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616528 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.616795 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s92q\" (UniqueName: \"kubernetes.io/projected/26fbd30a-a485-4463-9aac-bb695c43e9e3-kube-api-access-7s92q\") pod \"barbican-worker-65f4d58895-tvn59\" (UID: \"26fbd30a-a485-4463-9aac-bb695c43e9e3\") " pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.634595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mz2\" (UniqueName: \"kubernetes.io/projected/d30c642c-b4ae-495a-8acd-cc8be4a0f412-kube-api-access-24mz2\") pod \"barbican-keystone-listener-64f98fd86b-96l6n\" (UID: \"d30c642c-b4ae-495a-8acd-cc8be4a0f412\") " pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.650919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"dnsmasq-dns-775688cbd9-h9lg8\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.716676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.716967 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.717269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.727235 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818528 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818686 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.818738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " 
pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.819120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.824028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.825745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.832872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.835188 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"barbican-api-686d9f9896-9zsh2\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:49 crc kubenswrapper[4792]: 
I0301 09:28:49.884237 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65f4d58895-tvn59" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897451 4792 generic.go:334] "Generic (PLEG): container finished" podID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerID="5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee" exitCode=0 Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897613 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.897509 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerDied","Data":"5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee"} Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.910189 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" Mar 01 09:28:49 crc kubenswrapper[4792]: I0301 09:28:49.986650 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.001347 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.016938 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.054037 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.057096 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.060382 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.060505 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.066552 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.136990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " 
pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.137177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.238746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.239268 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc 
kubenswrapper[4792]: I0301 09:28:50.246126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.249141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.249815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.250107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.251126 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.264651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod \"ceilometer-0\" (UID: 
\"73747536-7a61-4c63-87c7-9e4c72471fb1\") " pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.352514 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:50 crc kubenswrapper[4792]: W0301 09:28:50.355357 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc418cc81_d55e_4678_8aad_caa1573d366a.slice/crio-bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc WatchSource:0}: Error finding container bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc: Status 404 returned error can't find the container with id bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.399187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.480148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f4d58895-tvn59"] Mar 01 09:28:50 crc kubenswrapper[4792]: W0301 09:28:50.498651 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26fbd30a_a485_4463_9aac_bb695c43e9e3.slice/crio-23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9 WatchSource:0}: Error finding container 23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9: Status 404 returned error can't find the container with id 23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9 Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.673169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64f98fd86b-96l6n"] Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.683651 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] 
Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.907527 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"128fecf3528e96b978b34551abfcb59be5f329262f2b7798f19013b2931fb841"} Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.910206 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"c729a979d8c29010d2c655ac4de93ea2cacda6aa6f0a16f0f9981e0f5dbbf81d"} Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912065 4792 generic.go:334] "Generic (PLEG): container finished" podID="c418cc81-d55e-4678-8aad-caa1573d366a" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" exitCode=0 Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863"} Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.912156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerStarted","Data":"bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc"} Mar 01 09:28:50 crc kubenswrapper[4792]: I0301 09:28:50.918344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"23f5431953315a66506197a7fed33c29c81e45caf335668eef31542ca24365d9"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.098898 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 
01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.295716 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371065 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371328 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.371366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 
09:28:51.371484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") pod \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\" (UID: \"737aa0a0-6e53-451e-9d5f-2deada87b5b4\") " Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.374296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.376414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq" (OuterVolumeSpecName: "kube-api-access-84xmq") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "kube-api-access-84xmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.377592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.380672 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts" (OuterVolumeSpecName: "scripts") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.432315 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.439322 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d1f920-cbe0-4883-8030-826eab25677d" path="/var/lib/kubelet/pods/e3d1f920-cbe0-4883-8030-826eab25677d/volumes" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.496704 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data" (OuterVolumeSpecName: "config-data") pod "737aa0a0-6e53-451e-9d5f-2deada87b5b4" (UID: "737aa0a0-6e53-451e-9d5f-2deada87b5b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497836 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497881 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497894 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497936 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84xmq\" (UniqueName: \"kubernetes.io/projected/737aa0a0-6e53-451e-9d5f-2deada87b5b4-kube-api-access-84xmq\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497952 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737aa0a0-6e53-451e-9d5f-2deada87b5b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.497962 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/737aa0a0-6e53-451e-9d5f-2deada87b5b4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938383 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gsxqb" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gsxqb" event={"ID":"737aa0a0-6e53-451e-9d5f-2deada87b5b4","Type":"ContainerDied","Data":"dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.938984 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc7bf25ff6493b89f8d3d42eee96feaadc16025ec1b5d1ef3c591647a4fb7abf" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.947103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"8f407749323a926af0db11e4921c8f80c0b44788d7a0172e925467426ce4a55c"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.951864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.952141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerStarted","Data":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.952606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.955143 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.961522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerStarted","Data":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.962491 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:51 crc kubenswrapper[4792]: I0301 09:28:51.985194 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-686d9f9896-9zsh2" podStartSLOduration=2.985166409 podStartE2EDuration="2.985166409s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:51.97705835 +0000 UTC m=+1261.218937567" watchObservedRunningTime="2026-03-01 09:28:51.985166409 +0000 UTC m=+1261.227045606" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.013894 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" podStartSLOduration=3.013873264 podStartE2EDuration="3.013873264s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:52.005676223 +0000 UTC m=+1261.247555430" watchObservedRunningTime="2026-03-01 09:28:52.013873264 +0000 UTC m=+1261.255752461" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371118 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: E0301 09:28:52.371746 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371758 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.371954 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" containerName="cinder-db-sync" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.372804 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.384501 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.384719 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.388852 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.397714 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7rpd7" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.421804 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444530 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444557 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444593 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.444639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.532095 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546191 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546353 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.546413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " 
pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.549235 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.553521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.565180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.565842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.573625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.587089 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttz97\" (UniqueName: 
\"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"cinder-scheduler-0\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.644164 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.645624 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.678595 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.733692 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.750244 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.807761 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.809152 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.817430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.832202 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852027 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852249 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852456 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.852527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.853474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.853525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.854880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: 
\"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.855620 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.905640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"dnsmasq-dns-7675674687-rrbg6\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953762 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953874 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.953962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.954014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.954634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.955421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"cinder-api-0\" (UID: 
\"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.958010 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.958435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.964351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.966960 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.988057 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:52 crc kubenswrapper[4792]: I0301 09:28:52.994097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"cinder-api-0\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " pod="openstack/cinder-api-0" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.122974 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.124824 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129178 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.129276 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.139870 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: 
\"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.267382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: 
\"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.368962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369080 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " 
pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.369670 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c97472a-b6b7-4fc4-b872-a318812f0999-logs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373071 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data-custom\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-combined-ca-bundle\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.373668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-public-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.374384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-internal-tls-certs\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: 
I0301 09:28:53.389001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c97472a-b6b7-4fc4-b872-a318812f0999-config-data\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.396650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2tf\" (UniqueName: \"kubernetes.io/projected/6c97472a-b6b7-4fc4-b872-a318812f0999-kube-api-access-kq2tf\") pod \"barbican-api-7f86869f48-jg6nw\" (UID: \"6c97472a-b6b7-4fc4-b872-a318812f0999\") " pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:53 crc kubenswrapper[4792]: I0301 09:28:53.452454 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.024327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"1b653d0f5781794221f385cc2010eeaf82b408f4b2eb6e3922c0102992bc8f4f"} Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.024590 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns" containerID="cri-o://fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" gracePeriod=10 Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.211404 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.236024 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f86869f48-jg6nw"] Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.389026 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:54 crc kubenswrapper[4792]: W0301 09:28:54.414871 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3261c1a4_1fc3_4584_a04b_a909176a21a7.slice/crio-878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0 WatchSource:0}: Error finding container 878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0: Status 404 returned error can't find the container with id 878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0 Mar 01 09:28:54 crc kubenswrapper[4792]: I0301 09:28:54.437980 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.056478 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.057840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"e1d7268f85aab2519b72baa24d075192715425b6b0034f558542dc731de246d1"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.057898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" event={"ID":"d30c642c-b4ae-495a-8acd-cc8be4a0f412","Type":"ContainerStarted","Data":"5eb771ebcc4fb088cc9d586d34d67eee927b15054e27473a11dea7307e6e8eda"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.063563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"35af3639b921b65729861f597174631d1ccc2f9baac748d7d2575658d3be08b9"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.065051 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066612 4792 generic.go:334] "Generic (PLEG): container finished" podID="c418cc81-d55e-4678-8aad-caa1573d366a" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" exitCode=0 Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" event={"ID":"c418cc81-d55e-4678-8aad-caa1573d366a","Type":"ContainerDied","Data":"bdfd8bffec09df88b8baa122fcaea45ce8205a2b6eb0d36b5009c9f753f43ecc"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066692 4792 scope.go:117] "RemoveContainer" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.066823 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-h9lg8" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.090310 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f4d58895-tvn59" event={"ID":"26fbd30a-a485-4463-9aac-bb695c43e9e3","Type":"ContainerStarted","Data":"6c1dbcb00cf56971be88b525c52f9576a288f8b7e09492e30c48b16b99cb4b57"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"05fcac918e336fbdbec1df9312c0287b079540d558f63f641f32c22e0617f400"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099734 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"6d204bf177c538e6871de5e9d16e20c8bf22d250d590cffb0112e0c43ed73d59"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.099747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f86869f48-jg6nw" event={"ID":"6c97472a-b6b7-4fc4-b872-a318812f0999","Type":"ContainerStarted","Data":"20f3c43ea4550355b0c719f618fbe77a0797f616d74f172246bcf6104f3d4e32"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132337 4792 generic.go:334] "Generic (PLEG): container finished" podID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerID="1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae" exitCode=0 Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132374 4792 scope.go:117] "RemoveContainer" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" 
event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.132484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerStarted","Data":"23c50a76020a45988e337315d0efa1b099af136e2bf3deb4ff1bf47f7e64507f"} Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.140035 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64f98fd86b-96l6n" podStartSLOduration=3.222343751 podStartE2EDuration="6.140010607s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:50.682962685 +0000 UTC m=+1259.924841882" lastFinishedPulling="2026-03-01 09:28:53.600629541 +0000 UTC m=+1262.842508738" observedRunningTime="2026-03-01 09:28:55.110735269 +0000 UTC m=+1264.352614466" watchObservedRunningTime="2026-03-01 09:28:55.140010607 +0000 UTC m=+1264.381889804" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.171086 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f86869f48-jg6nw" podStartSLOduration=2.171060759 podStartE2EDuration="2.171060759s" podCreationTimestamp="2026-03-01 09:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:55.149114051 +0000 UTC m=+1264.390993248" watchObservedRunningTime="2026-03-01 09:28:55.171060759 +0000 UTC m=+1264.412939956" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.178136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05"} Mar 01 09:28:55 crc 
kubenswrapper[4792]: I0301 09:28:55.186070 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65f4d58895-tvn59" podStartSLOduration=3.184217885 podStartE2EDuration="6.186037487s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:50.501273075 +0000 UTC m=+1259.743152272" lastFinishedPulling="2026-03-01 09:28:53.503092687 +0000 UTC m=+1262.744971874" observedRunningTime="2026-03-01 09:28:55.172762401 +0000 UTC m=+1264.414641598" watchObservedRunningTime="2026-03-01 09:28:55.186037487 +0000 UTC m=+1264.427916684" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186478 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.186696 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: 
\"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.187274 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") pod \"c418cc81-d55e-4678-8aad-caa1573d366a\" (UID: \"c418cc81-d55e-4678-8aad-caa1573d366a\") " Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.245346 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g" (OuterVolumeSpecName: "kube-api-access-mcf7g") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "kube-api-access-mcf7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.292895 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcf7g\" (UniqueName: \"kubernetes.io/projected/c418cc81-d55e-4678-8aad-caa1573d366a-kube-api-access-mcf7g\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.305017 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.306750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.325301 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.353083 4792 scope.go:117] "RemoveContainer" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: E0301 09:28:55.356064 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": container with ID starting with fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e not found: ID does not exist" containerID="fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356105 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e"} err="failed to get container status \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": rpc error: code = NotFound desc = could not find container \"fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e\": container with ID starting with fe493b7561a052b1ecbcb59aceb3c59fff9d9037dafcf4b3ac23d8ab22a13b4e not found: ID does not exist" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356133 4792 scope.go:117] "RemoveContainer" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: E0301 09:28:55.356626 4792 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": container with ID starting with fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863 not found: ID does not exist" containerID="fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.356665 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863"} err="failed to get container status \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": rpc error: code = NotFound desc = could not find container \"fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863\": container with ID starting with fb4bc5fb705e29cf6511138c6ac8f7d95127aa4a1c119e960a46f1f39b113863 not found: ID does not exist" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.397995 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.398024 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.398033 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.434138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config" 
(OuterVolumeSpecName: "config") pod "c418cc81-d55e-4678-8aad-caa1573d366a" (UID: "c418cc81-d55e-4678-8aad-caa1573d366a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.500162 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c418cc81-d55e-4678-8aad-caa1573d366a-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.741564 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:55 crc kubenswrapper[4792]: I0301 09:28:55.751641 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-h9lg8"] Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.277456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a"} Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.278445 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.278505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:28:56 crc kubenswrapper[4792]: I0301 09:28:56.287518 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.306402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c"} Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.313793 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerStarted","Data":"fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869"} Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.313879 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.325301 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e"} Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.329060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167"} Mar 01 09:28:57 crc kubenswrapper[4792]: I0301 09:28:57.421030 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" path="/var/lib/kubelet/pods/c418cc81-d55e-4678-8aad-caa1573d366a/volumes" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.310528 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.329665 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podStartSLOduration=6.329622159 podStartE2EDuration="6.329622159s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:57.349005048 +0000 UTC m=+1266.590884245" watchObservedRunningTime="2026-03-01 09:28:58.329622159 +0000 UTC m=+1267.571501356" Mar 01 
09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.339599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerStarted","Data":"70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc"} Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerStarted","Data":"6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1"} Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341752 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api" containerID="cri-o://6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1" gracePeriod=30 Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.341881 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log" containerID="cri-o://ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c" gracePeriod=30 Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.342185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.405272 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.435118022 podStartE2EDuration="6.405253165s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="2026-03-01 09:28:54.460323354 +0000 UTC m=+1263.702202551" lastFinishedPulling="2026-03-01 09:28:55.430458507 +0000 UTC m=+1264.672337694" observedRunningTime="2026-03-01 09:28:58.3923952 +0000 UTC m=+1267.634274397" 
watchObservedRunningTime="2026-03-01 09:28:58.405253165 +0000 UTC m=+1267.647132362" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.435618 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.43560435 podStartE2EDuration="6.43560435s" podCreationTimestamp="2026-03-01 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:28:58.433359045 +0000 UTC m=+1267.675238242" watchObservedRunningTime="2026-03-01 09:28:58.43560435 +0000 UTC m=+1267.677483537" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.662089 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"] Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.663404 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" containerID="cri-o://5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257" gracePeriod=30 Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.663630 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api" containerID="cri-o://171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc" gracePeriod=30 Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.676244 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": read tcp 10.217.0.2:59862->10.217.0.145:9696: read: connection reset by peer" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717346 4792 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"] Mar 01 09:28:58 crc kubenswrapper[4792]: E0301 09:28:58.717830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717859 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns" Mar 01 09:28:58 crc kubenswrapper[4792]: E0301 09:28:58.717874 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="init" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.717881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="init" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.718091 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c418cc81-d55e-4678-8aad-caa1573d366a" containerName="dnsmasq-dns" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.719107 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.746354 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"] Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802637 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802694 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvgv\" (UniqueName: \"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.802868 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.904952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvgv\" (UniqueName: 
\"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905525 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.905611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.919081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-public-tls-certs\") pod 
\"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.919492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-internal-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.924724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-ovndb-tls-certs\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.943294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.943314 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-combined-ca-bundle\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc kubenswrapper[4792]: I0301 09:28:58.952789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ceced30a-39e5-413f-a498-e5d4500f1eea-httpd-config\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:58 crc 
kubenswrapper[4792]: I0301 09:28:58.966078 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvgv\" (UniqueName: \"kubernetes.io/projected/ceced30a-39e5-413f-a498-e5d4500f1eea-kube-api-access-lhvgv\") pod \"neutron-6c8bdfb955-kjg92\" (UID: \"ceced30a-39e5-413f-a498-e5d4500f1eea\") " pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.055968 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392082 4792 generic.go:334] "Generic (PLEG): container finished" podID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerID="6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1" exitCode=0 Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392398 4792 generic.go:334] "Generic (PLEG): container finished" podID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerID="ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c" exitCode=143 Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1"} Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.392479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c"} Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.412398 4792 generic.go:334] "Generic (PLEG): container finished" podID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerID="5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257" exitCode=0 Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.472799 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257"} Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.479695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerStarted","Data":"3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91"} Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.479776 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.845064 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.878878 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.164376608 podStartE2EDuration="10.878862727s" podCreationTimestamp="2026-03-01 09:28:49 +0000 UTC" firstStartedPulling="2026-03-01 09:28:51.114443956 +0000 UTC m=+1260.356323153" lastFinishedPulling="2026-03-01 09:28:58.828930075 +0000 UTC m=+1268.070809272" observedRunningTime="2026-03-01 09:28:59.569572075 +0000 UTC m=+1268.811451262" watchObservedRunningTime="2026-03-01 09:28:59.878862727 +0000 UTC m=+1269.120741924" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.899986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8bdfb955-kjg92"] Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943665 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc 
kubenswrapper[4792]: I0301 09:28:59.943740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943867 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.943966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.944039 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") pod \"3261c1a4-1fc3-4584-a04b-a909176a21a7\" (UID: \"3261c1a4-1fc3-4584-a04b-a909176a21a7\") " Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.944604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.949011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs" (OuterVolumeSpecName: "logs") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.984580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.987353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts" (OuterVolumeSpecName: "scripts") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:28:59 crc kubenswrapper[4792]: I0301 09:28:59.996064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd" (OuterVolumeSpecName: "kube-api-access-r4cnd") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "kube-api-access-r4cnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052331 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cnd\" (UniqueName: \"kubernetes.io/projected/3261c1a4-1fc3-4584-a04b-a909176a21a7-kube-api-access-r4cnd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052377 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3261c1a4-1fc3-4584-a04b-a909176a21a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052389 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052400 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.052412 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3261c1a4-1fc3-4584-a04b-a909176a21a7-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.091619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.125706 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data" (OuterVolumeSpecName: "config-data") pod "3261c1a4-1fc3-4584-a04b-a909176a21a7" (UID: "3261c1a4-1fc3-4584-a04b-a909176a21a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.153932 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.153973 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3261c1a4-1fc3-4584-a04b-a909176a21a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.324183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66955dfdb5-6j2wx" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": dial tcp 10.217.0.145:9696: connect: connection refused" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.488843 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"5ef35c6b23e5b2b25f4b975f83b3da467fa9da4c5c353859a75a722b4fa63404"} Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 
09:29:00.488897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"167fa02eaeaac4e573ff079dd56096477c58422ab30bb1c424fbc17b68903c31"} Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.491947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3261c1a4-1fc3-4584-a04b-a909176a21a7","Type":"ContainerDied","Data":"878d29e29b8947901ac3b55766798c44fc0595df994212304bdbb24caf194ff0"} Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.491998 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.492043 4792 scope.go:117] "RemoveContainer" containerID="6a2b4266f6e00d6aee5a043cd5566817d0900e0782c4c9c3fd61fe7b047d58d1" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.563221 4792 scope.go:117] "RemoveContainer" containerID="ebb3c90c66d61b3c96e538bb3a90dec662ac00998a3c455a15d931238797819c" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.595893 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.663242 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.679945 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:00 crc kubenswrapper[4792]: E0301 09:29:00.680306 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680323 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log" Mar 01 09:29:00 crc kubenswrapper[4792]: E0301 09:29:00.680335 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680343 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680509 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.680529 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" containerName="cinder-api-log" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.681375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.684827 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.687343 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.688177 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.690804 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.766888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767017 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767150 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.767246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869271 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 
09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870022 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870109 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.869484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/084f9db1-15eb-458c-8b43-aeb5dbb0555f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.870123 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f9db1-15eb-458c-8b43-aeb5dbb0555f-logs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.919416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data-custom\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.920797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.921034 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.922787 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.922790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zw59\" (UniqueName: \"kubernetes.io/projected/084f9db1-15eb-458c-8b43-aeb5dbb0555f-kube-api-access-4zw59\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.923646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-config-data\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:00 crc kubenswrapper[4792]: I0301 09:29:00.926438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f9db1-15eb-458c-8b43-aeb5dbb0555f-scripts\") pod \"cinder-api-0\" (UID: \"084f9db1-15eb-458c-8b43-aeb5dbb0555f\") " pod="openstack/cinder-api-0" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.000963 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.032723 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.437257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3261c1a4-1fc3-4584-a04b-a909176a21a7" path="/var/lib/kubelet/pods/3261c1a4-1fc3-4584-a04b-a909176a21a7/volumes" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.529342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8bdfb955-kjg92" event={"ID":"ceced30a-39e5-413f-a498-e5d4500f1eea","Type":"ContainerStarted","Data":"361e3576f7f3325679921320011c685a597478c9f150cb082a5ff7ee75562f43"} Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.530318 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.656154 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c8bdfb955-kjg92" podStartSLOduration=3.6561348110000003 podStartE2EDuration="3.656134811s" podCreationTimestamp="2026-03-01 09:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:01.562129293 +0000 UTC m=+1270.804008510" watchObservedRunningTime="2026-03-01 09:29:01.656134811 +0000 UTC m=+1270.898014008" Mar 01 09:29:01 crc kubenswrapper[4792]: I0301 09:29:01.656456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.579597 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"8673a2c51f6ab102b805447b07d56e08fcc32edb4292ec021588a926f6ef7f67"} Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.580103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"1c0429271d7fd689692116225410decbc6ae777d83db08e21c9aa60aeac32e90"} Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.739073 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 01 09:29:02 crc kubenswrapper[4792]: I0301 09:29:02.969103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.032478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.032720 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" containerID="cri-o://4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" gracePeriod=10 Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.213351 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.606372 4792 generic.go:334] "Generic (PLEG): container finished" podID="82467164-5e77-4ea0-beee-b3a70126c075" containerID="4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" exitCode=0 Mar 01 09:29:03 crc 
kubenswrapper[4792]: I0301 09:29:03.606794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70"} Mar 01 09:29:03 crc kubenswrapper[4792]: I0301 09:29:03.928963 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.030113 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046296 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046681 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.046707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") pod \"82467164-5e77-4ea0-beee-b3a70126c075\" (UID: \"82467164-5e77-4ea0-beee-b3a70126c075\") " Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.070290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm" (OuterVolumeSpecName: "kube-api-access-hg4gm") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "kube-api-access-hg4gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.155112 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg4gm\" (UniqueName: \"kubernetes.io/projected/82467164-5e77-4ea0-beee-b3a70126c075-kube-api-access-hg4gm\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.222594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.229334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.230463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: "82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259132 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259164 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.259175 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.260350 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config" (OuterVolumeSpecName: "config") pod "82467164-5e77-4ea0-beee-b3a70126c075" (UID: 
"82467164-5e77-4ea0-beee-b3a70126c075"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.360806 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82467164-5e77-4ea0-beee-b3a70126c075-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537147 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537199 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537211 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.537296 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.615666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" event={"ID":"82467164-5e77-4ea0-beee-b3a70126c075","Type":"ContainerDied","Data":"8b9ae7283e8be3bed9877439415e05f84a3d2c818de0aa317ad4a53c2c6bc4d6"} Mar 01 09:29:04 crc 
kubenswrapper[4792]: I0301 09:29:04.615713 4792 scope.go:117] "RemoveContainer" containerID="4fab450c58bb439c06a393afb4862347d838c516e4e1d271f6a55936b2e24c70" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.615822 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-drpr8" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.629788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"084f9db1-15eb-458c-8b43-aeb5dbb0555f","Type":"ContainerStarted","Data":"a0703157303b9cf20b3406286f8ec3e802bff9f1ce1298238119085d904b579c"} Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.630000 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.652677 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.654742 4792 scope.go:117] "RemoveContainer" containerID="610fd4808253a760f048f5c379d2e39f3eec919b6601aa983fb2a5f9ece83ce6" Mar 01 09:29:04 crc kubenswrapper[4792]: I0301 09:29:04.669523 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-drpr8"] Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.071183 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.071253 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:05 crc kubenswrapper[4792]: I0301 09:29:05.419018 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82467164-5e77-4ea0-beee-b3a70126c075" path="/var/lib/kubelet/pods/82467164-5e77-4ea0-beee-b3a70126c075/volumes" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.031600 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.058650 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.058634044 podStartE2EDuration="6.058634044s" podCreationTimestamp="2026-03-01 09:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:04.692440519 +0000 UTC m=+1273.934319716" watchObservedRunningTime="2026-03-01 09:29:06.058634044 +0000 UTC m=+1275.300513241" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.074089 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315087 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:06 crc kubenswrapper[4792]: E0301 09:29:06.315426 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="init" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315441 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="init" Mar 01 09:29:06 crc kubenswrapper[4792]: E0301 09:29:06.315480 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315486 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.315635 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="82467164-5e77-4ea0-beee-b3a70126c075" containerName="dnsmasq-dns" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.316483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.371678 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412098 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qskh\" (UniqueName: 
\"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412343 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.412394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.515879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.515973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516032 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qskh\" (UniqueName: \"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod 
\"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.516153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.519350 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f9e703-dec0-46e1-a428-580bdb68e54e-logs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.528671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-internal-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.531302 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-scripts\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.534683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-combined-ca-bundle\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 
09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.536458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-public-tls-certs\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.548566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f9e703-dec0-46e1-a428-580bdb68e54e-config-data\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.561557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qskh\" (UniqueName: \"kubernetes.io/projected/18f9e703-dec0-46e1-a428-580bdb68e54e-kube-api-access-9qskh\") pod \"placement-84f9696594-qdwsv\" (UID: \"18f9e703-dec0-46e1-a428-580bdb68e54e\") " pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:06 crc kubenswrapper[4792]: I0301 09:29:06.648592 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.209174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84f9696594-qdwsv"] Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.603739 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.683470 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"c568ea25201f17a718f34a8b7570230a932f887e8964f3c3604abb9921beee61"} Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.683533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"b751d48eb0ce6cde01933eb86ab298561ebf12321c14d198ca69621f3d6bb2f9"} Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.752312 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 01 09:29:07 crc kubenswrapper[4792]: I0301 09:29:07.795492 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.219605 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f86869f48-jg6nw" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.315640 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.315945 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" 
containerID="cri-o://b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" gracePeriod=30 Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.316456 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" containerID="cri-o://b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" gracePeriod=30 Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.332121 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.332206 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344114 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344503 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.344688 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": EOF" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.711065 4792 generic.go:334] "Generic (PLEG): container finished" podID="4721d2a8-efb5-4fa3-9779-797448455198" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" exitCode=143 Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.711152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"} Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716093 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" containerID="cri-o://375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167" gracePeriod=30 Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84f9696594-qdwsv" event={"ID":"18f9e703-dec0-46e1-a428-580bdb68e54e","Type":"ContainerStarted","Data":"7b8baaece986d008808fd338ff18190d583e93de1447e07f1b636893c7f2019e"} Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.716658 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe" containerID="cri-o://70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc" gracePeriod=30 Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.717017 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.717056 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:08 crc kubenswrapper[4792]: I0301 09:29:08.766145 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-84f9696594-qdwsv" podStartSLOduration=2.766123392 podStartE2EDuration="2.766123392s" podCreationTimestamp="2026-03-01 09:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:08.76567031 +0000 UTC m=+1278.007549507" watchObservedRunningTime="2026-03-01 09:29:08.766123392 +0000 UTC m=+1278.008002589" Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.542121 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7f86869f48-jg6nw" podUID="6c97472a-b6b7-4fc4-b872-a318812f0999" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.725271 4792 generic.go:334] "Generic (PLEG): container finished" podID="5a7e11fe-898a-442c-b619-d7ccea385948" containerID="70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc" exitCode=0 Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.725357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc"} Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.727985 4792 generic.go:334] "Generic (PLEG): container finished" podID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerID="171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc" exitCode=0 Mar 01 09:29:09 crc kubenswrapper[4792]: I0301 09:29:09.728337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" 
event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc"} Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.200223 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-749f685d77-ggsln" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746256 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66955dfdb5-6j2wx" event={"ID":"90e1a395-ebc6-49ca-9924-c64283c12ec4","Type":"ContainerDied","Data":"0e01f5820150bb847cd98736209f40a8cfae4c1d42fc832b6f738e299bc2db88"} Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.746771 4792 scope.go:117] "RemoveContainer" containerID="5ed4ad048c8c293d006aad77d5956e915974d390f9663c49f5f3c523ced04257" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.759286 4792 generic.go:334] "Generic (PLEG): container finished" podID="5a7e11fe-898a-442c-b619-d7ccea385948" containerID="375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167" exitCode=0 Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.759331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167"} Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818593 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818623 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818686 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") pod \"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.818897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") pod 
\"90e1a395-ebc6-49ca-9924-c64283c12ec4\" (UID: \"90e1a395-ebc6-49ca-9924-c64283c12ec4\") " Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.819362 4792 scope.go:117] "RemoveContainer" containerID="171f60709b450c4b056a3daad4b651d3e95b5d1e5702d55092b0d107469669dc" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.830473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh" (OuterVolumeSpecName: "kube-api-access-h8fsh") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "kube-api-access-h8fsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.841778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.903281 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920483 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920657 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8fsh\" (UniqueName: \"kubernetes.io/projected/90e1a395-ebc6-49ca-9924-c64283c12ec4-kube-api-access-h8fsh\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.920717 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:10 crc kubenswrapper[4792]: I0301 09:29:10.975460 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config" (OuterVolumeSpecName: "config") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.029775 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.033154 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.033183 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.046046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.060770 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e1a395-ebc6-49ca-9924-c64283c12ec4" (UID: "90e1a395-ebc6-49ca-9924-c64283c12ec4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.140224 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.140267 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e1a395-ebc6-49ca-9924-c64283c12ec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.203715 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343320 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttz97\" 
(UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") pod \"5a7e11fe-898a-442c-b619-d7ccea385948\" (UID: \"5a7e11fe-898a-442c-b619-d7ccea385948\") " Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.343977 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.344493 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a7e11fe-898a-442c-b619-d7ccea385948-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.348386 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts" (OuterVolumeSpecName: "scripts") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.352163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97" (OuterVolumeSpecName: "kube-api-access-ttz97") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "kube-api-access-ttz97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.353046 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.403258 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446448 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446481 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446490 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttz97\" (UniqueName: \"kubernetes.io/projected/5a7e11fe-898a-442c-b619-d7ccea385948-kube-api-access-ttz97\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.446502 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.451077 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data" (OuterVolumeSpecName: "config-data") pod "5a7e11fe-898a-442c-b619-d7ccea385948" (UID: "5a7e11fe-898a-442c-b619-d7ccea385948"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.548544 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7e11fe-898a-442c-b619-d7ccea385948-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a7e11fe-898a-442c-b619-d7ccea385948","Type":"ContainerDied","Data":"35af3639b921b65729861f597174631d1ccc2f9baac748d7d2575658d3be08b9"} Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768581 4792 scope.go:117] "RemoveContainer" containerID="70aca8fafacb1a0424dab05fe3248c2f55093cc69642e8fa33c562316f86f9cc" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.768509 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.771044 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66955dfdb5-6j2wx" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.805937 4792 scope.go:117] "RemoveContainer" containerID="375e2e53d30f9933a6980133a207824c3a278795869bc631327981b1d5488167" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.815629 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"] Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.834066 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66955dfdb5-6j2wx"] Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.850966 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.863436 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880205 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880608 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880625 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880635 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880641 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe" Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880653 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" 
containerName="neutron-httpd" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880659 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" Mar 01 09:29:11 crc kubenswrapper[4792]: E0301 09:29:11.880671 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880677 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880849 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-httpd" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880861 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="probe" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880871 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" containerName="neutron-api" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.880884 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" containerName="cinder-scheduler" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.881708 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.884198 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.914468 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 
09:29:11.956384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:11 crc kubenswrapper[4792]: I0301 09:29:11.956423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.057926 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058117 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058165 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.058457 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688f590f-ae5c-4caf-b8c7-013a118f42c5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.064145 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-scripts\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.064833 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " 
pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.066041 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.073445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688f590f-ae5c-4caf-b8c7-013a118f42c5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.099172 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk84z\" (UniqueName: \"kubernetes.io/projected/688f590f-ae5c-4caf-b8c7-013a118f42c5-kube-api-access-lk84z\") pod \"cinder-scheduler-0\" (UID: \"688f590f-ae5c-4caf-b8c7-013a118f42c5\") " pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.197092 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.488073 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.489512 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.493336 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.493863 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.496221 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-f6v2c" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.539984 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.572914 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.573155 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.675742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.677601 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.680376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.687641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecafda6-dcf9-46ea-8678-8da499154ad7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.735984 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bj9\" (UniqueName: \"kubernetes.io/projected/fecafda6-dcf9-46ea-8678-8da499154ad7-kube-api-access-p5bj9\") pod \"openstackclient\" (UID: \"fecafda6-dcf9-46ea-8678-8da499154ad7\") " pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.823205 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 01 09:29:12 crc kubenswrapper[4792]: I0301 09:29:12.908376 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 01 09:29:12 crc kubenswrapper[4792]: W0301 09:29:12.912549 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688f590f_ae5c_4caf_b8c7_013a118f42c5.slice/crio-7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169 WatchSource:0}: Error finding container 7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169: Status 404 returned error can't find the container with id 7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169 Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.379657 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.385193 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.385314 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:13 crc kubenswrapper[4792]: W0301 09:29:13.398294 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfecafda6_dcf9_46ea_8678_8da499154ad7.slice/crio-945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c WatchSource:0}: Error finding container 945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c: Status 404 returned error can't find the container with id 
945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.419133 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7e11fe-898a-442c-b619-d7ccea385948" path="/var/lib/kubelet/pods/5a7e11fe-898a-442c-b619-d7ccea385948/volumes" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.419870 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e1a395-ebc6-49ca-9924-c64283c12ec4" path="/var/lib/kubelet/pods/90e1a395-ebc6-49ca-9924-c64283c12ec4/volumes" Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.796662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"33ccd7b9bd7cd18961dff5220e934d271bdba777b71b912d25cd7d44e78f7eeb"} Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.797069 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"7e55159b365aaa12efa912d1ed7a3716f8dae566216ff02c51088f4e8c77d169"} Mar 01 09:29:13 crc kubenswrapper[4792]: I0301 09:29:13.800399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fecafda6-dcf9-46ea-8678-8da499154ad7","Type":"ContainerStarted","Data":"945cdd534bc89f7e8b59ea5fd1ca3574fe34009c41b218483277963fb4e4ef6c"} Mar 01 09:29:14 crc kubenswrapper[4792]: I0301 09:29:14.811969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"688f590f-ae5c-4caf-b8c7-013a118f42c5","Type":"ContainerStarted","Data":"06eb3da651cd4d4a1436808973840acba906224bea877ac66132b9f76f86965a"} Mar 01 09:29:14 crc kubenswrapper[4792]: I0301 09:29:14.838475 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.838459822 
podStartE2EDuration="3.838459822s" podCreationTimestamp="2026-03-01 09:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:14.836524464 +0000 UTC m=+1284.078403661" watchObservedRunningTime="2026-03-01 09:29:14.838459822 +0000 UTC m=+1284.080339019" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.031451 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.079343 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="084f9db1-15eb-458c-8b43-aeb5dbb0555f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.159:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.079875 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.092067 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:34784->10.217.0.152:9311: read: connection reset by peer" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.092108 4792 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/barbican-api-686d9f9896-9zsh2" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:34794->10.217.0.152:9311: read: connection reset by peer" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.597588 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.609579 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656411 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.656584 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") pod \"4721d2a8-efb5-4fa3-9779-797448455198\" (UID: \"4721d2a8-efb5-4fa3-9779-797448455198\") " Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.658104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs" (OuterVolumeSpecName: "logs") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.673558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b" (OuterVolumeSpecName: "kube-api-access-58r6b") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "kube-api-access-58r6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.675234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.757880 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759672 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4721d2a8-efb5-4fa3-9779-797448455198-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759696 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58r6b\" (UniqueName: \"kubernetes.io/projected/4721d2a8-efb5-4fa3-9779-797448455198-kube-api-access-58r6b\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759707 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.759714 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.769847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data" (OuterVolumeSpecName: "config-data") pod "4721d2a8-efb5-4fa3-9779-797448455198" (UID: "4721d2a8-efb5-4fa3-9779-797448455198"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.857123 4792 generic.go:334] "Generic (PLEG): container finished" podID="4721d2a8-efb5-4fa3-9779-797448455198" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" exitCode=0 Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858116 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-686d9f9896-9zsh2" event={"ID":"4721d2a8-efb5-4fa3-9779-797448455198","Type":"ContainerDied","Data":"128fecf3528e96b978b34551abfcb59be5f329262f2b7798f19013b2931fb841"} Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858164 4792 scope.go:117] "RemoveContainer" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.858266 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-686d9f9896-9zsh2" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.863103 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4721d2a8-efb5-4fa3-9779-797448455198-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.922964 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.957882 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-686d9f9896-9zsh2"] Mar 01 09:29:15 crc kubenswrapper[4792]: I0301 09:29:15.962026 4792 scope.go:117] "RemoveContainer" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.042281 4792 scope.go:117] "RemoveContainer" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:16 crc kubenswrapper[4792]: E0301 09:29:16.051509 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": container with ID starting with b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b not found: ID does not exist" containerID="b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.051600 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b"} err="failed to get container status \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": rpc error: code = NotFound desc = could not find container \"b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b\": container with ID starting with 
b2421256e4e0d8725502059d1b77eb867deef0339092d9a858244caa156e8a2b not found: ID does not exist" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.051629 4792 scope.go:117] "RemoveContainer" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: E0301 09:29:16.053340 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": container with ID starting with b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943 not found: ID does not exist" containerID="b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943" Mar 01 09:29:16 crc kubenswrapper[4792]: I0301 09:29:16.053380 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943"} err="failed to get container status \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": rpc error: code = NotFound desc = could not find container \"b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943\": container with ID starting with b0ab985ce4b6e50f0fbdce6821fe03c40fba76255764f0fad8a448b7b0c30943 not found: ID does not exist" Mar 01 09:29:17 crc kubenswrapper[4792]: I0301 09:29:17.197836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 01 09:29:17 crc kubenswrapper[4792]: I0301 09:29:17.417285 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4721d2a8-efb5-4fa3-9779-797448455198" path="/var/lib/kubelet/pods/4721d2a8-efb5-4fa3-9779-797448455198/volumes" Mar 01 09:29:20 crc kubenswrapper[4792]: I0301 09:29:20.421709 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:29:22 crc kubenswrapper[4792]: I0301 09:29:22.426882 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 01 09:29:23 crc kubenswrapper[4792]: I0301 09:29:23.124709 4792 scope.go:117] "RemoveContainer" containerID="b3416cff442b7b3bec1893fb5c0aa2d61087db5d4679dae5ed62f8ea4a150ca7" Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607201 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607452 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent" containerID="cri-o://b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607539 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd" containerID="cri-o://3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607566 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent" containerID="cri-o://30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.607574 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core" containerID="cri-o://23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e" gracePeriod=30 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950302 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" 
containerID="3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91" exitCode=0 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950552 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e" exitCode=2 Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91"} Mar 01 09:29:24 crc kubenswrapper[4792]: I0301 09:29:24.950607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e"} Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961693 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a" exitCode=0 Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961731 4792 generic.go:334] "Generic (PLEG): container finished" podID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerID="b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05" exitCode=0 Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a"} Mar 01 09:29:25 crc kubenswrapper[4792]: I0301 09:29:25.961781 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05"} Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.085214 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169104 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169138 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169223 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") pod 
\"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.169610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") pod \"73747536-7a61-4c63-87c7-9e4c72471fb1\" (UID: \"73747536-7a61-4c63-87c7-9e4c72471fb1\") " Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.171065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.189148 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.201211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8" (OuterVolumeSpecName: "kube-api-access-kpnn8") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). 
InnerVolumeSpecName "kube-api-access-kpnn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.204150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts" (OuterVolumeSpecName: "scripts") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274483 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274513 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpnn8\" (UniqueName: \"kubernetes.io/projected/73747536-7a61-4c63-87c7-9e4c72471fb1-kube-api-access-kpnn8\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274522 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.274533 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73747536-7a61-4c63-87c7-9e4c72471fb1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.292119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.376548 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.480366 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.487060 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data" (OuterVolumeSpecName: "config-data") pod "73747536-7a61-4c63-87c7-9e4c72471fb1" (UID: "73747536-7a61-4c63-87c7-9e4c72471fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.579181 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.579217 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73747536-7a61-4c63-87c7-9e4c72471fb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982681 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982676 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73747536-7a61-4c63-87c7-9e4c72471fb1","Type":"ContainerDied","Data":"8f407749323a926af0db11e4921c8f80c0b44788d7a0172e925467426ce4a55c"} Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.982967 4792 scope.go:117] "RemoveContainer" containerID="3b83ecfb929c62345bb4c7e97676547d3a0dba6dd25466d0bdaa53eaba514d91" Mar 01 09:29:27 crc kubenswrapper[4792]: I0301 09:29:27.985029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fecafda6-dcf9-46ea-8678-8da499154ad7","Type":"ContainerStarted","Data":"2eee12d1f27a3adae4e9750f69a737edde8e026f3558c981659a70a138181bbd"} Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.001113 4792 scope.go:117] "RemoveContainer" containerID="23d29fcdda3c38825545e571c946d862613874d7ef2eb012aff4b4af60d5268e" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.016169 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.7124536 podStartE2EDuration="16.01614824s" podCreationTimestamp="2026-03-01 09:29:12 +0000 UTC" firstStartedPulling="2026-03-01 09:29:13.403697735 +0000 UTC m=+1282.645576932" lastFinishedPulling="2026-03-01 09:29:26.707392375 +0000 UTC m=+1295.949271572" observedRunningTime="2026-03-01 09:29:28.003458588 +0000 UTC m=+1297.245337785" watchObservedRunningTime="2026-03-01 09:29:28.01614824 +0000 UTC m=+1297.258027437" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.035831 4792 scope.go:117] "RemoveContainer" containerID="30490c34339b14f02bc47ec05535e3863a7df94573da07df5321b635c71d016a" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.080938 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 
09:29:28.101062 4792 scope.go:117] "RemoveContainer" containerID="b90600661de56082bcb7ae56c666abd530a56c37ee4cdb45c25503eb416e5a05" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.103400 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.115494 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.115946 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.115963 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core" Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.115983 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116007 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116017 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116025 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116053 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" 
containerName="ceilometer-central-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116068 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116075 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd" Mar 01 09:29:28 crc kubenswrapper[4792]: E0301 09:29:28.116086 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116093 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116308 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api-log" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116329 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-central-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116340 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="proxy-httpd" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116354 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4721d2a8-efb5-4fa3-9779-797448455198" containerName="barbican-api" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116370 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="sg-core" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.116384 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" containerName="ceilometer-notification-agent" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.121433 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.124001 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.125383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.152498 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292831 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292857 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.292972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.293005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.293297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.394837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: 
\"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.395671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.398503 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.399742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.400663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.401527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.422287 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"ceilometer-0\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " pod="openstack/ceilometer-0" Mar 01 09:29:28 crc kubenswrapper[4792]: I0301 09:29:28.448485 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.007614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.017300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"a26311f7480277741aeeb123ba5ce9e74cfc4cc3c5c9582b395475f7202f8016"} Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.089070 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c8bdfb955-kjg92" Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.166929 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.167169 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bbc5b86d6-8b672" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api" containerID="cri-o://ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5" gracePeriod=30 Mar 01 09:29:29 crc kubenswrapper[4792]: I0301 09:29:29.167602 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bbc5b86d6-8b672" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd" containerID="cri-o://2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" gracePeriod=30 Mar 01 09:29:29 crc 
kubenswrapper[4792]: I0301 09:29:29.476231 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73747536-7a61-4c63-87c7-9e4c72471fb1" path="/var/lib/kubelet/pods/73747536-7a61-4c63-87c7-9e4c72471fb1/volumes" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.025112 4792 generic.go:334] "Generic (PLEG): container finished" podID="013566fd-5627-422a-809a-e81a8ec059d9" containerID="2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" exitCode=0 Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.025183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63"} Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.615223 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.616715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.642636 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.719790 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.726491 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.741467 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.749442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.749545 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.810800 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.811868 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.823309 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.824664 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.828550 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.838054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.851210 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " 
pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.852122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.868138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.890737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"nova-api-db-create-zj224\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.934110 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959107 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod 
\"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.959294 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.960357 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:30 crc kubenswrapper[4792]: I0301 09:29:30.983033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"nova-cell0-db-create-qt42r\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.025447 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.026703 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.029124 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.051746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067725 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.067831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: 
\"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.069688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.076864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.099103 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.109829 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c"} Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.118153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"nova-api-474a-account-create-update-dlgkl\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.145212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqnk\" (UniqueName: 
\"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"nova-cell1-db-create-8f2r2\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.149979 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.173893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.173964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.255221 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.258185 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.273315 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.275749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.303287 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.314456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod 
\"nova-cell0-3b7a-account-create-update-2vc26\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.353414 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.377093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.377266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.432525 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.480145 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.480252 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.481494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.500672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"nova-cell1-c72f-account-create-update-x9vvj\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.576424 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.669614 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.748819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:29:31 crc kubenswrapper[4792]: I0301 09:29:31.908687 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.075091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.195764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerStarted","Data":"5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d"} Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.222070 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerStarted","Data":"82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af"} Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.222116 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerStarted","Data":"035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87"} Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.239692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.263860 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-zj224" podStartSLOduration=2.263840258 podStartE2EDuration="2.263840258s" 
podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:32.258270465 +0000 UTC m=+1301.500149682" watchObservedRunningTime="2026-03-01 09:29:32.263840258 +0000 UTC m=+1301.505719455" Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.263973 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.283619 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549"} Mar 01 09:29:32 crc kubenswrapper[4792]: I0301 09:29:32.481540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.312693 4792 generic.go:334] "Generic (PLEG): container finished" podID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerID="7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.312885 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerDied","Data":"7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.319474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerStarted","Data":"dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.319512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerStarted","Data":"dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.323563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerStarted","Data":"92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.323621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerStarted","Data":"3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.330383 4792 generic.go:334] "Generic (PLEG): container finished" podID="a069955e-f546-4522-97ec-5a529f79b1aa" containerID="82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.330456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerDied","Data":"82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.353882 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerID="05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e" exitCode=0 Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.353993 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerDied","Data":"05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e"} 
Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.354018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerStarted","Data":"e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.397343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerStarted","Data":"d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.398807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerStarted","Data":"771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.402016 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-8f2r2" podStartSLOduration=3.401996338 podStartE2EDuration="3.401996338s" podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:33.38038928 +0000 UTC m=+1302.622268477" watchObservedRunningTime="2026-03-01 09:29:33.401996338 +0000 UTC m=+1302.643875535" Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.447855 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-474a-account-create-update-dlgkl" podStartSLOduration=3.447836667 podStartE2EDuration="3.447836667s" podCreationTimestamp="2026-03-01 09:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-01 09:29:33.423159266 +0000 UTC m=+1302.665038463" watchObservedRunningTime="2026-03-01 09:29:33.447836667 +0000 UTC m=+1302.689715864" Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.489433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df"} Mar 01 09:29:33 crc kubenswrapper[4792]: I0301 09:29:33.500352 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" podStartSLOduration=2.500334086 podStartE2EDuration="2.500334086s" podCreationTimestamp="2026-03-01 09:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:29:33.466582467 +0000 UTC m=+1302.708461664" watchObservedRunningTime="2026-03-01 09:29:33.500334086 +0000 UTC m=+1302.742213283" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.437283 4792 generic.go:334] "Generic (PLEG): container finished" podID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerID="d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.437648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerDied","Data":"d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.439944 4792 generic.go:334] "Generic (PLEG): container finished" podID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerID="dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.440004 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerDied","Data":"dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.443785 4792 generic.go:334] "Generic (PLEG): container finished" podID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerID="92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.443856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerDied","Data":"92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.446086 4792 generic.go:334] "Generic (PLEG): container finished" podID="013566fd-5627-422a-809a-e81a8ec059d9" containerID="ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5" exitCode=0 Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.446188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5"} Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.703678 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.820086 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.882879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883006 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.883263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") pod \"013566fd-5627-422a-809a-e81a8ec059d9\" (UID: \"013566fd-5627-422a-809a-e81a8ec059d9\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.897417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq" (OuterVolumeSpecName: "kube-api-access-hsctq") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "kube-api-access-hsctq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.900827 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.960515 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.960584 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") pod \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985323 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") pod \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\" (UID: \"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2\") " Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985761 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsctq\" (UniqueName: \"kubernetes.io/projected/013566fd-5627-422a-809a-e81a8ec059d9-kube-api-access-hsctq\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985775 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:34 crc kubenswrapper[4792]: I0301 09:29:34.985962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" (UID: "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.007707 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk" (OuterVolumeSpecName: "kube-api-access-dkcdk") pod "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" (UID: "e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2"). InnerVolumeSpecName "kube-api-access-dkcdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.049465 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config" (OuterVolumeSpecName: "config") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.050080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.076477 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.081543 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.086967 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.086994 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.087003 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkcdk\" (UniqueName: \"kubernetes.io/projected/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2-kube-api-access-dkcdk\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.087014 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.089034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "013566fd-5627-422a-809a-e81a8ec059d9" (UID: "013566fd-5627-422a-809a-e81a8ec059d9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188266 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") pod \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188362 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") pod \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\" (UID: \"21b0442e-f4b4-4f59-b3c5-1510ae4d792c\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") pod \"a069955e-f546-4522-97ec-5a529f79b1aa\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.188515 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") pod \"a069955e-f546-4522-97ec-5a529f79b1aa\" (UID: \"a069955e-f546-4522-97ec-5a529f79b1aa\") " Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189083 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21b0442e-f4b4-4f59-b3c5-1510ae4d792c" (UID: "21b0442e-f4b4-4f59-b3c5-1510ae4d792c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189089 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a069955e-f546-4522-97ec-5a529f79b1aa" (UID: "a069955e-f546-4522-97ec-5a529f79b1aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189398 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189418 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/013566fd-5627-422a-809a-e81a8ec059d9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.189427 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a069955e-f546-4522-97ec-5a529f79b1aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.191590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq" (OuterVolumeSpecName: "kube-api-access-z52tq") pod "a069955e-f546-4522-97ec-5a529f79b1aa" (UID: "a069955e-f546-4522-97ec-5a529f79b1aa"). InnerVolumeSpecName "kube-api-access-z52tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.195028 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5" (OuterVolumeSpecName: "kube-api-access-lj6j5") pod "21b0442e-f4b4-4f59-b3c5-1510ae4d792c" (UID: "21b0442e-f4b4-4f59-b3c5-1510ae4d792c"). InnerVolumeSpecName "kube-api-access-lj6j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.291456 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj6j5\" (UniqueName: \"kubernetes.io/projected/21b0442e-f4b4-4f59-b3c5-1510ae4d792c-kube-api-access-lj6j5\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.291700 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z52tq\" (UniqueName: \"kubernetes.io/projected/a069955e-f546-4522-97ec-5a529f79b1aa-kube-api-access-z52tq\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qt42r" event={"ID":"21b0442e-f4b4-4f59-b3c5-1510ae4d792c","Type":"ContainerDied","Data":"5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454800 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qt42r" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.454805 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac29da5fa52816d6af4f4e774013cba308601b08c0ed36851caa33072f6b41d" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zj224" event={"ID":"a069955e-f546-4522-97ec-5a529f79b1aa","Type":"ContainerDied","Data":"035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456568 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035975688c011495ce93c6c1b1155e999f1292fecaac7c3671be568df507bd87" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.456537 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zj224" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458789 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3b7a-account-create-update-2vc26" event={"ID":"e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2","Type":"ContainerDied","Data":"e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.458877 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e15a28a43dd6c1126f3b22e8480bf4ba4434e945130f45feec86bf90e7913964" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462351 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bbc5b86d6-8b672" event={"ID":"013566fd-5627-422a-809a-e81a8ec059d9","Type":"ContainerDied","Data":"120df1c67b7935983b4052512da03cb57b70749fae0a4306db0f53d6bed8199c"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462389 4792 scope.go:117] "RemoveContainer" containerID="2f7d0d1b6918c9e962de8e492d201469a86475bd623f4086bd8a9e1f30a74d63" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.462544 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bbc5b86d6-8b672" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471477 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent" containerID="cri-o://20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c" gracePeriod=30 Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerStarted","Data":"387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b"} Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471919 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.471982 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd" containerID="cri-o://387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b" gracePeriod=30 Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.472028 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core" containerID="cri-o://f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df" gracePeriod=30 Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.472091 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent" containerID="cri-o://2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549" gracePeriod=30 Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.499733 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.503860 4792 scope.go:117] "RemoveContainer" containerID="ca55616f2e5de805229eb50d3f643de3b33e4b53039ccf7569dc0337fc8e14a5" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.509500 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bbc5b86d6-8b672"] Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.526188 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.19990666 podStartE2EDuration="7.526024389s" podCreationTimestamp="2026-03-01 09:29:28 +0000 UTC" firstStartedPulling="2026-03-01 09:29:28.978710237 +0000 UTC m=+1298.220589434" lastFinishedPulling="2026-03-01 09:29:34.304827966 +0000 UTC m=+1303.546707163" observedRunningTime="2026-03-01 09:29:35.519742168 +0000 UTC m=+1304.761621365" watchObservedRunningTime="2026-03-01 09:29:35.526024389 +0000 UTC m=+1304.767903586" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.912056 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:35 crc kubenswrapper[4792]: I0301 09:29:35.994179 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.000893 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.016739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") pod \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.016800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgpl\" (UniqueName: \"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") pod \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\" (UID: \"f2be4f49-c20a-4e25-bff3-e4617d275fa1\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.019963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2be4f49-c20a-4e25-bff3-e4617d275fa1" (UID: "f2be4f49-c20a-4e25-bff3-e4617d275fa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.022576 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl" (OuterVolumeSpecName: "kube-api-access-9dgpl") pod "f2be4f49-c20a-4e25-bff3-e4617d275fa1" (UID: "f2be4f49-c20a-4e25-bff3-e4617d275fa1"). InnerVolumeSpecName "kube-api-access-9dgpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150072 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") pod \"09b4c86e-31ba-4d91-a602-39fa3a57c798\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") pod \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") pod \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\" (UID: \"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.150421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") pod \"09b4c86e-31ba-4d91-a602-39fa3a57c798\" (UID: \"09b4c86e-31ba-4d91-a602-39fa3a57c798\") " Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151171 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2be4f49-c20a-4e25-bff3-e4617d275fa1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151192 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgpl\" (UniqueName: 
\"kubernetes.io/projected/f2be4f49-c20a-4e25-bff3-e4617d275fa1-kube-api-access-9dgpl\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.151972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" (UID: "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.154311 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk" (OuterVolumeSpecName: "kube-api-access-ztqnk") pod "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" (UID: "f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe"). InnerVolumeSpecName "kube-api-access-ztqnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.155247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09b4c86e-31ba-4d91-a602-39fa3a57c798" (UID: "09b4c86e-31ba-4d91-a602-39fa3a57c798"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.160345 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw" (OuterVolumeSpecName: "kube-api-access-fw6bw") pod "09b4c86e-31ba-4d91-a602-39fa3a57c798" (UID: "09b4c86e-31ba-4d91-a602-39fa3a57c798"). InnerVolumeSpecName "kube-api-access-fw6bw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252631 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09b4c86e-31ba-4d91-a602-39fa3a57c798-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252668 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqnk\" (UniqueName: \"kubernetes.io/projected/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-kube-api-access-ztqnk\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252679 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.252688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw6bw\" (UniqueName: \"kubernetes.io/projected/09b4c86e-31ba-4d91-a602-39fa3a57c798-kube-api-access-fw6bw\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481037 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8f2r2" event={"ID":"f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe","Type":"ContainerDied","Data":"dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b"} Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481549 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadc3c08b41a7bc9b02729db8b51278d209590e1ed422026617fbb3c1754363b" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.481125 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8f2r2" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-474a-account-create-update-dlgkl" event={"ID":"09b4c86e-31ba-4d91-a602-39fa3a57c798","Type":"ContainerDied","Data":"3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213"} Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493183 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3feefdb24fc4187fb81280730c9fa62f502ce358f1aaf7380b427d95a1866213" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.493958 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-474a-account-create-update-dlgkl" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499679 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c72f-account-create-update-x9vvj" event={"ID":"f2be4f49-c20a-4e25-bff3-e4617d275fa1","Type":"ContainerDied","Data":"771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72"} Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.499889 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="771711ce9c2752c3012e04bcade441dfda2723a81435feaa4ce6f2649bebee72" Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508396 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b" exitCode=0 Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508423 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" 
containerID="f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df" exitCode=2 Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508431 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549" exitCode=0 Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b"} Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df"} Mar 01 09:29:36 crc kubenswrapper[4792]: I0301 09:29:36.508478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549"} Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.421587 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013566fd-5627-422a-809a-e81a8ec059d9" path="/var/lib/kubelet/pods/013566fd-5627-422a-809a-e81a8ec059d9/volumes" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.526240 4792 generic.go:334] "Generic (PLEG): container finished" podID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerID="20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c" exitCode=0 Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.526283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c"} Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.781587 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879668 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879929 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.879948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") pod 
\"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880554 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") pod \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\" (UID: \"594710f1-32aa-4acc-a8ea-8cfec7b2c28c\") " Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.880836 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.881198 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.882685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.885638 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts" (OuterVolumeSpecName: "scripts") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.911590 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9" (OuterVolumeSpecName: "kube-api-access-ptvq9") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "kube-api-access-ptvq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.951881 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.980989 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983045 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983245 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983329 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983387 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:37 crc kubenswrapper[4792]: I0301 09:29:37.983591 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvq9\" (UniqueName: \"kubernetes.io/projected/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-kube-api-access-ptvq9\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.015729 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data" (OuterVolumeSpecName: "config-data") pod "594710f1-32aa-4acc-a8ea-8cfec7b2c28c" (UID: "594710f1-32aa-4acc-a8ea-8cfec7b2c28c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.072316 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.085989 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594710f1-32aa-4acc-a8ea-8cfec7b2c28c-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.208326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84f9696594-qdwsv" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"] Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284532 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f7447dcd6-cpnn5" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log" containerID="cri-o://4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" gracePeriod=30 Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.284681 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7f7447dcd6-cpnn5" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api" containerID="cri-o://9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" gracePeriod=30 Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537066 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"594710f1-32aa-4acc-a8ea-8cfec7b2c28c","Type":"ContainerDied","Data":"a26311f7480277741aeeb123ba5ce9e74cfc4cc3c5c9582b395475f7202f8016"} Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537122 4792 scope.go:117] "RemoveContainer" 
containerID="387825e4421b63de17f8a95a35da8706ef3e20db6f2a3819f7e3c7def308bb9b" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.537130 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.540539 4792 generic.go:334] "Generic (PLEG): container finished" podID="947b32da-5664-42ff-a665-ac182dea1433" containerID="4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" exitCode=143 Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.540790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de"} Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.568248 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.577619 4792 scope.go:117] "RemoveContainer" containerID="f1c336e0ebe268dd9ca94d30a4f01dbdf7925917a75d6e87aa3514ba966658df" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.586556 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.615592 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616006 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616022 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616032 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616037 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616049 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616056 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616064 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616069 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616086 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616092 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616108 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616114 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 
09:29:38.616123 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616129 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616140 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616147 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616158 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616163 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616172 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api" Mar 01 09:29:38 crc kubenswrapper[4792]: E0301 09:29:38.616185 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc 
kubenswrapper[4792]: E0301 09:29:38.616205 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616211 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616363 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616394 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-central-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616408 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="proxy-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616419 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616433 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616446 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" containerName="mariadb-account-create-update" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616453 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-httpd" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616468 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="013566fd-5627-422a-809a-e81a8ec059d9" containerName="neutron-api" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616488 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="ceilometer-notification-agent" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616502 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" containerName="mariadb-database-create" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.616520 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" containerName="sg-core" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.618513 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.619601 4792 scope.go:117] "RemoveContainer" containerID="2974b52b9fe70c71e1067f1e48fbfe7f66447a69916e1c15a62b5a2377bf5549" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.621044 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.621163 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.680636 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.684684 4792 scope.go:117] "RemoveContainer" containerID="20059357b7d0dcf77c24de3fd3876e0d7701b71cfa87b07f6aaaa2491b33046c" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " 
pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702918 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.702997 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.703075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.804503 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.805831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 
crc kubenswrapper[4792]: I0301 09:29:38.806174 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.806516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.810364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.810988 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.811599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.822575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"ceilometer-0\" (UID: 
\"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.828589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"ceilometer-0\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") " pod="openstack/ceilometer-0" Mar 01 09:29:38 crc kubenswrapper[4792]: I0301 09:29:38.947109 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.418092 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594710f1-32aa-4acc-a8ea-8cfec7b2c28c" path="/var/lib/kubelet/pods/594710f1-32aa-4acc-a8ea-8cfec7b2c28c/volumes" Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.445074 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:39 crc kubenswrapper[4792]: I0301 09:29:39.552322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"0f890f6f19c9d8250258adc7aeb8d26ea84175d18054680a23e399f6e29382bc"} Mar 01 09:29:40 crc kubenswrapper[4792]: I0301 09:29:40.561802 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.077854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.079781 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.088152 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hfrkw" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.089332 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.092623 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.102095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.178225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.279869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.305488 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.319401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"nova-cell0-conductor-db-sync-tp5l7\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.543245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.606551 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.606592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.614387 4792 generic.go:334] "Generic (PLEG): container finished" podID="947b32da-5664-42ff-a665-ac182dea1433" containerID="9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" exitCode=0 Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.614426 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9"} Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.952274 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995893 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.995995 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996064 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.996088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") pod \"947b32da-5664-42ff-a665-ac182dea1433\" (UID: \"947b32da-5664-42ff-a665-ac182dea1433\") " Mar 01 09:29:41 crc kubenswrapper[4792]: I0301 09:29:41.997310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs" (OuterVolumeSpecName: "logs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.008546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll" (OuterVolumeSpecName: "kube-api-access-9trll") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "kube-api-access-9trll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.020040 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts" (OuterVolumeSpecName: "scripts") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.090217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.092022 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data" (OuterVolumeSpecName: "config-data") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097891 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9trll\" (UniqueName: \"kubernetes.io/projected/947b32da-5664-42ff-a665-ac182dea1433-kube-api-access-9trll\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097927 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/947b32da-5664-42ff-a665-ac182dea1433-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.097937 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.098033 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.112335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "947b32da-5664-42ff-a665-ac182dea1433" 
(UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.134237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.139767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "947b32da-5664-42ff-a665-ac182dea1433" (UID: "947b32da-5664-42ff-a665-ac182dea1433"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199169 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199200 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.199209 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947b32da-5664-42ff-a665-ac182dea1433-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.624186 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerStarted","Data":"69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9"} Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.626810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7447dcd6-cpnn5" event={"ID":"947b32da-5664-42ff-a665-ac182dea1433","Type":"ContainerDied","Data":"5d91235d937c34cd2f7190314bb2d67a56ac04c2b63da01a91c659eee91c9179"} Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.626858 4792 scope.go:117] "RemoveContainer" containerID="9288e4694d44c535f5db6a3588b5f57161da3432c7fdcf10f8abb56af7bd4be9" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.627006 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7447dcd6-cpnn5" Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.672229 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.678397 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f7447dcd6-cpnn5"] Mar 01 09:29:42 crc kubenswrapper[4792]: I0301 09:29:42.689308 4792 scope.go:117] "RemoveContainer" containerID="4555af75d6b8f403b02d065ff405189b215772f534b2956e72f7441d250ba2de" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.419628 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947b32da-5664-42ff-a665-ac182dea1433" path="/var/lib/kubelet/pods/947b32da-5664-42ff-a665-ac182dea1433/volumes" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.644385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerStarted","Data":"b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb"} Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.645573 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:29:43 crc kubenswrapper[4792]: I0301 09:29:43.673046 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.319227069 podStartE2EDuration="5.672999376s" podCreationTimestamp="2026-03-01 09:29:38 +0000 UTC" firstStartedPulling="2026-03-01 09:29:39.445057809 +0000 UTC m=+1308.686937016" lastFinishedPulling="2026-03-01 09:29:42.798830126 +0000 UTC m=+1312.040709323" observedRunningTime="2026-03-01 09:29:43.668606111 +0000 UTC m=+1312.910485308" watchObservedRunningTime="2026-03-01 09:29:43.672999376 +0000 UTC m=+1312.914878573" Mar 01 09:29:45 crc kubenswrapper[4792]: I0301 09:29:45.321371 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665763 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent" containerID="cri-o://03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665816 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent" containerID="cri-o://11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665810 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd" containerID="cri-o://b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb" gracePeriod=30 Mar 01 09:29:46 crc kubenswrapper[4792]: I0301 09:29:46.665840 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core" containerID="cri-o://821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf" gracePeriod=30
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.682980 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb" exitCode=0
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683259 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf" exitCode=2
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683269 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7" exitCode=0
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683276 4792 generic.go:334] "Generic (PLEG): container finished" podID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerID="03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9" exitCode=0
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb"}
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf"}
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683328 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7"}
Mar 01 09:29:47 crc kubenswrapper[4792]: I0301 09:29:47.683337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9"}
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.245922 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398304 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398380 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398487 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.398570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") pod \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\" (UID: \"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf\") "
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399373 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399683 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.399703 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.403229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts" (OuterVolumeSpecName: "scripts") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.403543 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc" (OuterVolumeSpecName: "kube-api-access-b67jc") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "kube-api-access-b67jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.448533 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.499139 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501359 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501402 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b67jc\" (UniqueName: \"kubernetes.io/projected/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-kube-api-access-b67jc\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501425 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-scripts\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.501442 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.538053 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data" (OuterVolumeSpecName: "config-data") pod "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" (UID: "f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.603101 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.726428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerStarted","Data":"b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13"}
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.731939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf","Type":"ContainerDied","Data":"0f890f6f19c9d8250258adc7aeb8d26ea84175d18054680a23e399f6e29382bc"}
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.732002 4792 scope.go:117] "RemoveContainer" containerID="b5e11cb105cfba6cc6777dcc36d6a7386137980e60c28e5e4cb193f8ef3736fb"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.732174 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.769678 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" podStartSLOduration=1.869845159 podStartE2EDuration="10.769653728s" podCreationTimestamp="2026-03-01 09:29:41 +0000 UTC" firstStartedPulling="2026-03-01 09:29:42.099768424 +0000 UTC m=+1311.341647621" lastFinishedPulling="2026-03-01 09:29:50.999576993 +0000 UTC m=+1320.241456190" observedRunningTime="2026-03-01 09:29:51.743455629 +0000 UTC m=+1320.985334866" watchObservedRunningTime="2026-03-01 09:29:51.769653728 +0000 UTC m=+1321.011532925"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.771769 4792 scope.go:117] "RemoveContainer" containerID="821a69d721c5f5e39af061016fb4dd3aae85412f19901bf33a586d3b5889a0bf"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.787237 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.794639 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.834451 4792 scope.go:117] "RemoveContainer" containerID="11adf9995e830da132d5cee2f25399dd37eec4af12f052d976a4771af2aa9ab7"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.842730 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core"
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843213 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843220 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log"
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843234 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843240 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd"
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843252 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843257 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api"
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843273 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843279 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: E0301 09:29:51.843291 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843297 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843454 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-central-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843471 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="ceilometer-notification-agent"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-log"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843490 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="947b32da-5664-42ff-a665-ac182dea1433" containerName="placement-api"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843499 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="proxy-httpd"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.843509 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" containerName="sg-core"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.845407 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.849583 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.852142 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.859221 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.880075 4792 scope.go:117] "RemoveContainer" containerID="03d77111dbe6af8e832c82b26c1b62e1ab0b322bd6de059889982bff5460bbf9"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.909452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910280 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:51 crc kubenswrapper[4792]: I0301 09:29:51.910455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.011869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.012864 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.015391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.017833 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.024080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.029618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.031861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"ceilometer-0\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.176034 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.689295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:52 crc kubenswrapper[4792]: W0301 09:29:52.695152 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a01d08c_d6df_4d6f_8541_1900fdc49572.slice/crio-e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9 WatchSource:0}: Error finding container e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9: Status 404 returned error can't find the container with id e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.741075 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9"}
Mar 01 09:29:52 crc kubenswrapper[4792]: I0301 09:29:52.942681 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 01 09:29:53 crc kubenswrapper[4792]: I0301 09:29:53.425729 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf" path="/var/lib/kubelet/pods/f8e3e6bc-55dd-4d85-8c06-8ae9376e9cdf/volumes"
Mar 01 09:29:53 crc kubenswrapper[4792]: I0301 09:29:53.749943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"}
Mar 01 09:29:54 crc kubenswrapper[4792]: I0301 09:29:54.758664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"}
Mar 01 09:29:54 crc kubenswrapper[4792]: I0301 09:29:54.760026 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"}
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.782703 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerStarted","Data":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"}
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.783402 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent" containerID="cri-o://92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" gracePeriod=30
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.783645 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784178 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd" containerID="cri-o://d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" gracePeriod=30
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784226 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core" containerID="cri-o://1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" gracePeriod=30
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.784260 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent" containerID="cri-o://a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" gracePeriod=30
Mar 01 09:29:57 crc kubenswrapper[4792]: I0301 09:29:57.811486 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.375608685 podStartE2EDuration="6.811467158s" podCreationTimestamp="2026-03-01 09:29:51 +0000 UTC" firstStartedPulling="2026-03-01 09:29:52.698312165 +0000 UTC m=+1321.940191382" lastFinishedPulling="2026-03-01 09:29:57.134170658 +0000 UTC m=+1326.376049855" observedRunningTime="2026-03-01 09:29:57.804702216 +0000 UTC m=+1327.046581413" watchObservedRunningTime="2026-03-01 09:29:57.811467158 +0000 UTC m=+1327.053346345"
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.809319 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" exitCode=0
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810149 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" exitCode=2
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810225 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" exitCode=0
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.809385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"}
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"}
Mar 01 09:29:58 crc kubenswrapper[4792]: I0301 09:29:58.810435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"}
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.137013 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"]
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.138781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.144661 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"]
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.145708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.145811 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.154457 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.154883 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.155024 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.155665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"]
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.157552 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.164963 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"]
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265331 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265391 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.265416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.367506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.368353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.377671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.388525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod \"collect-profiles-29539290-wch24\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.393148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"auto-csr-approver-29539290-npvbb\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.464352 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.479346 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.756171 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836531 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" exitCode=0
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"}
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a01d08c-d6df-4d6f-8541-1900fdc49572","Type":"ContainerDied","Data":"e73ae627714e9a71df05613aefebb2895fe9d5a5671a7f1ed0848a6a3b3c37e9"}
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836617 4792 scope.go:117] "RemoveContainer" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.836743 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.866155 4792 scope.go:117] "RemoveContainer" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880223 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880406 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880429 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") "
Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID:
\"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.880949 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") pod \"0a01d08c-d6df-4d6f-8541-1900fdc49572\" (UID: \"0a01d08c-d6df-4d6f-8541-1900fdc49572\") " Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882337 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882878 4792 scope.go:117] "RemoveContainer" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.882928 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.888326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc" (OuterVolumeSpecName: "kube-api-access-5gxqc") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "kube-api-access-5gxqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.888628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts" (OuterVolumeSpecName: "scripts") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.911144 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.912029 4792 scope.go:117] "RemoveContainer" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.937877 4792 scope.go:117] "RemoveContainer" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938232 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": container with ID starting with d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f not found: ID does not exist" containerID="d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938256 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f"} err="failed to get container status \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": rpc error: code = NotFound desc = could not find container \"d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f\": container with ID starting with d0d62026e9cee2a3842a4b40df8aa2796c73539e781c96bf2099aa7fe76f801f not found: ID does not exist" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938275 4792 scope.go:117] "RemoveContainer" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938640 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": container with ID starting with 
1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc not found: ID does not exist" containerID="1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938660 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc"} err="failed to get container status \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": rpc error: code = NotFound desc = could not find container \"1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc\": container with ID starting with 1262adca1515a35db418cfd53c93f2e1127fb911c650764e91834954617701cc not found: ID does not exist" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938673 4792 scope.go:117] "RemoveContainer" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.938888 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": container with ID starting with a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5 not found: ID does not exist" containerID="a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938917 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5"} err="failed to get container status \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": rpc error: code = NotFound desc = could not find container \"a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5\": container with ID starting with a38454a03f115d37c4fd02183858fd0260647cc7fdf0118804aa752bded945c5 not found: ID does not 
exist" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.938931 4792 scope.go:117] "RemoveContainer" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" Mar 01 09:30:00 crc kubenswrapper[4792]: E0301 09:30:00.939162 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": container with ID starting with 92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f not found: ID does not exist" containerID="92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.939183 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f"} err="failed to get container status \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": rpc error: code = NotFound desc = could not find container \"92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f\": container with ID starting with 92ac0ec70a0ab187bdd68568f0762d2d4a87c7aac17cf322ee9c2a9beff90c0f not found: ID does not exist" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.976280 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.978869 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data" (OuterVolumeSpecName: "config-data") pod "0a01d08c-d6df-4d6f-8541-1900fdc49572" (UID: "0a01d08c-d6df-4d6f-8541-1900fdc49572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.982846 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gxqc\" (UniqueName: \"kubernetes.io/projected/0a01d08c-d6df-4d6f-8541-1900fdc49572-kube-api-access-5gxqc\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.982944 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983020 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983077 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983130 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a01d08c-d6df-4d6f-8541-1900fdc49572-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983190 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:00 crc kubenswrapper[4792]: I0301 09:30:00.983252 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a01d08c-d6df-4d6f-8541-1900fdc49572-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.109166 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"] Mar 01 09:30:01 crc kubenswrapper[4792]: W0301 09:30:01.109588 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29833925_b21b_44d4_954c_e3252e5e69c4.slice/crio-7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39 WatchSource:0}: Error finding container 7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39: Status 404 returned error can't find the container with id 7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39 Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.176494 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.192882 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205374 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.205858 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent" Mar 01 09:30:01 crc 
kubenswrapper[4792]: E0301 09:30:01.205926 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205936 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent" Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.205984 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.205994 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd" Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.206021 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206062 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206257 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="proxy-httpd" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206276 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-central-agent" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206293 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="sg-core" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.206305 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" containerName="ceilometer-notification-agent" Mar 01 
09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.211115 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.221668 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.222148 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.266607 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.282337 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"] Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.297592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.299992 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300136 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300244 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.300359 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: E0301 09:30:01.375056 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a01d08c_d6df_4d6f_8541_1900fdc49572.slice\": RecentStats: unable to find data in memory cache]" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402720 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402756 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.402825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: 
I0301 09:30:01.402932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.403832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.405006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.409481 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.409828 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.411526 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") 
" pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.419424 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a01d08c-d6df-4d6f-8541-1900fdc49572" path="/var/lib/kubelet/pods/0a01d08c-d6df-4d6f-8541-1900fdc49572/volumes" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.422013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.423962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"ceilometer-0\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.534264 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.847303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerStarted","Data":"2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365"} Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.847348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerStarted","Data":"7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39"} Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.849715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerStarted","Data":"4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433"} Mar 01 09:30:01 crc kubenswrapper[4792]: I0301 09:30:01.868217 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" podStartSLOduration=1.8682008300000001 podStartE2EDuration="1.86820083s" podCreationTimestamp="2026-03-01 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:01.863409425 +0000 UTC m=+1331.105288632" watchObservedRunningTime="2026-03-01 09:30:01.86820083 +0000 UTC m=+1331.110080027" Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.003326 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:02 crc kubenswrapper[4792]: W0301 09:30:02.017447 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9dfa7c_35ce_4f0d_9439_ed55e060a486.slice/crio-13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11 WatchSource:0}: Error finding container 13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11: Status 404 returned error can't find the container with id 13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11 Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.859939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"} Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.860238 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11"} Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.861516 4792 generic.go:334] "Generic (PLEG): container finished" podID="29833925-b21b-44d4-954c-e3252e5e69c4" containerID="2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365" exitCode=0 Mar 01 09:30:02 crc kubenswrapper[4792]: I0301 09:30:02.861550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerDied","Data":"2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365"} Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.882098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"} Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.882407 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"} Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.886287 4792 generic.go:334] "Generic (PLEG): container finished" podID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerID="b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13" exitCode=0 Mar 01 09:30:03 crc kubenswrapper[4792]: I0301 09:30:03.887292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerDied","Data":"b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13"} Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.196269 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") pod \"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") pod \"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.261962 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") pod 
\"29833925-b21b-44d4-954c-e3252e5e69c4\" (UID: \"29833925-b21b-44d4-954c-e3252e5e69c4\") " Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.262618 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.267024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.269010 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl" (OuterVolumeSpecName: "kube-api-access-mjrfl") pod "29833925-b21b-44d4-954c-e3252e5e69c4" (UID: "29833925-b21b-44d4-954c-e3252e5e69c4"). InnerVolumeSpecName "kube-api-access-mjrfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365140 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjrfl\" (UniqueName: \"kubernetes.io/projected/29833925-b21b-44d4-954c-e3252e5e69c4-kube-api-access-mjrfl\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365532 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29833925-b21b-44d4-954c-e3252e5e69c4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.365593 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29833925-b21b-44d4-954c-e3252e5e69c4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.895745 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerID="7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513" exitCode=0 Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.895808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerDied","Data":"7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513"} Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898137 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24" event={"ID":"29833925-b21b-44d4-954c-e3252e5e69c4","Type":"ContainerDied","Data":"7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39"} Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.898246 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de1da900ab59863539f4e6656b31f82070f579ba91efc05ea10d5e6c1d0ff39" Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.943377 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:30:04 crc kubenswrapper[4792]: I0301 09:30:04.943436 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.198623 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285385 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.285513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") pod \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\" (UID: \"de6ead5c-face-41ff-ab6e-aebb7ca73c1c\") " Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.291605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf" (OuterVolumeSpecName: "kube-api-access-j4jjf") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "kube-api-access-j4jjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.307153 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts" (OuterVolumeSpecName: "scripts") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.312714 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data" (OuterVolumeSpecName: "config-data") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.316936 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de6ead5c-face-41ff-ab6e-aebb7ca73c1c" (UID: "de6ead5c-face-41ff-ab6e-aebb7ca73c1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387781 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4jjf\" (UniqueName: \"kubernetes.io/projected/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-kube-api-access-j4jjf\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387816 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387826 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.387840 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6ead5c-face-41ff-ab6e-aebb7ca73c1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" event={"ID":"de6ead5c-face-41ff-ab6e-aebb7ca73c1c","Type":"ContainerDied","Data":"69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9"} Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907539 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69bc546e1a44d03d4232782332f19345cbd8f578e040e6ba8b8c80c9abe934e9" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.907587 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tp5l7" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.913061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerStarted","Data":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.913105 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:30:05 crc kubenswrapper[4792]: I0301 09:30:05.946581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.413804943 podStartE2EDuration="4.946561921s" podCreationTimestamp="2026-03-01 09:30:01 +0000 UTC" firstStartedPulling="2026-03-01 09:30:02.019749034 +0000 UTC m=+1331.261628231" lastFinishedPulling="2026-03-01 09:30:05.552506012 +0000 UTC m=+1334.794385209" observedRunningTime="2026-03-01 09:30:05.939482401 +0000 UTC m=+1335.181361598" watchObservedRunningTime="2026-03-01 09:30:05.946561921 +0000 UTC m=+1335.188441118" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006018 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: E0301 09:30:06.006476 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006510 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: E0301 09:30:06.006525 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006533 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006724 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" containerName="collect-profiles" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.006761 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" containerName="nova-cell0-conductor-db-sync" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.008261 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.011165 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hfrkw" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.022146 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.035807 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.098885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.099099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " 
pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.099175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.200773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.200851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.201047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.207185 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 
09:30:06.207682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.218054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tffl\" (UniqueName: \"kubernetes.io/projected/f95aafcd-79b6-4ece-b3e1-ee9ea32a2754-kube-api-access-9tffl\") pod \"nova-cell0-conductor-0\" (UID: \"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754\") " pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.304672 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.325592 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.403960 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") pod \"d3644e57-7093-4402-a6f2-48ed10ac14fa\" (UID: \"d3644e57-7093-4402-a6f2-48ed10ac14fa\") " Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.419164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb" (OuterVolumeSpecName: "kube-api-access-nxxtb") pod "d3644e57-7093-4402-a6f2-48ed10ac14fa" (UID: "d3644e57-7093-4402-a6f2-48ed10ac14fa"). InnerVolumeSpecName "kube-api-access-nxxtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.507100 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxtb\" (UniqueName: \"kubernetes.io/projected/d3644e57-7093-4402-a6f2-48ed10ac14fa-kube-api-access-nxxtb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.745894 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 01 09:30:06 crc kubenswrapper[4792]: W0301 09:30:06.748034 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf95aafcd_79b6_4ece_b3e1_ee9ea32a2754.slice/crio-4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386 WatchSource:0}: Error finding container 4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386: Status 404 returned error can't find the container with id 4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386 Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.922109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539290-npvbb" event={"ID":"d3644e57-7093-4402-a6f2-48ed10ac14fa","Type":"ContainerDied","Data":"4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433"} Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.923446 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f151dd273dafbf61f585fc22ccdced51fd774a77ab507a61fa8f9d6f851d433" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.923532 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539290-npvbb" Mar 01 09:30:06 crc kubenswrapper[4792]: I0301 09:30:06.926512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754","Type":"ContainerStarted","Data":"4eadc073525a6b5d36de1b41c86d60c694e2676d2f2e09432b3268b0c047e386"} Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.386019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.394644 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539284-g5rbc"] Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.418714 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1263f40a-23c7-4ab8-8ebc-7c697e2eacd6" path="/var/lib/kubelet/pods/1263f40a-23c7-4ab8-8ebc-7c697e2eacd6/volumes" Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.934130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f95aafcd-79b6-4ece-b3e1-ee9ea32a2754","Type":"ContainerStarted","Data":"7a5248b939415c8e8f78362b848bacdcd5e78b6f41deb4adf883e48ced4035df"} Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.934280 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:07 crc kubenswrapper[4792]: I0301 09:30:07.953217 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.9531967569999997 podStartE2EDuration="2.953196757s" podCreationTimestamp="2026-03-01 09:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:07.94623533 +0000 UTC m=+1337.188114537" watchObservedRunningTime="2026-03-01 
09:30:07.953196757 +0000 UTC m=+1337.195075954" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.354439 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.792500 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:11 crc kubenswrapper[4792]: E0301 09:30:11.793131 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793147 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793331 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" containerName="oc" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.793840 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.797553 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.797829 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.814703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900744 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.900811 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.935702 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.947001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.949484 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:30:11 crc kubenswrapper[4792]: I0301 09:30:11.969600 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.001966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002068 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " 
pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.002161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.003984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.004029 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 
09:30:12.022285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.031147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.033007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.045567 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.045746 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"nova-cell0-cell-mapping-8vfwt\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.048354 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.058316 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.074459 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.109970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110097 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.110189 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.111128 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.116111 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.160747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.162026 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.213796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.214502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.229221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"nova-metadata-0\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.230629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.235766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.266639 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.267391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"nova-api-0\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.285315 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.286515 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.302529 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.311715 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.348788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.380257 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.382898 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.404041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.420091 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.421397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.423789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424682 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424725 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424934 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.424992 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.445425 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526846 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.526929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527051 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527100 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " 
pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527265 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527295 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.527350 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.528250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.529015 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.533646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.529852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.552742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dnsmasq-dns-7ff5b4cd7c-d6nmw\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.556091 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.561131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.561380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.563509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.563880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.573097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.653448 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.713589 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.749569 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.754700 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:30:12 crc kubenswrapper[4792]: I0301 09:30:12.953174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.015164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"527b5d3ab3c0911a39430a25a4440c1f01d3f9995da50cf30f22affc0352f613"} Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.018019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerStarted","Data":"3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61"} Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.108666 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.224578 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.226392 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.233893 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.239938 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.249765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" 
(UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: W0301 09:30:13.262378 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8015a6e_cf5d_4728_b2e6_66bb8960fd40.slice/crio-8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71 WatchSource:0}: Error finding container 8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71: Status 404 returned error can't find the container with id 8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71 Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.262921 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.277505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.352939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.353085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.377117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.377710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.380383 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.386799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"nova-cell1-conductor-db-sync-tjd85\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:13 crc kubenswrapper[4792]: W0301 09:30:13.425873 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae0c901_5f9c_4248_96dd_08acb2b5d278.slice/crio-c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca WatchSource:0}: Error finding container c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca: Status 404 returned error can't find the container with id c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.436899 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.566954 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:13 crc kubenswrapper[4792]: I0301 09:30:13.600730 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.064365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerStarted","Data":"cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.069032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerStarted","Data":"666f673dbf785249e4b855230a831650c499d51cb9413297a868fb5bc1afca52"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077542 4792 generic.go:334] "Generic (PLEG): container finished" podID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerID="4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9" exitCode=0 Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.077652 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerStarted","Data":"c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.082508 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"e1ddfe00df757587e10aacff0143490ad9255e7b1f73da1f009a37752ae1c648"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.084303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerStarted","Data":"8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71"} Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.097255 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8vfwt" podStartSLOduration=3.097238198 podStartE2EDuration="3.097238198s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:14.090350443 +0000 UTC m=+1343.332229640" watchObservedRunningTime="2026-03-01 09:30:14.097238198 +0000 UTC m=+1343.339117395" Mar 01 09:30:14 crc kubenswrapper[4792]: I0301 09:30:14.172030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:30:14 crc kubenswrapper[4792]: W0301 09:30:14.242979 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7269b8b7_440f_4fae_b0f1_f624e9d5b29a.slice/crio-c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480 WatchSource:0}: Error finding container c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480: Status 404 returned error can't find the container with id c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480 Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.102569 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerStarted","Data":"36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.103306 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:15 crc 
kubenswrapper[4792]: I0301 09:30:15.107175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerStarted","Data":"7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.107201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerStarted","Data":"c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480"} Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.127943 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podStartSLOduration=3.127920062 podStartE2EDuration="3.127920062s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:15.122219465 +0000 UTC m=+1344.364098662" watchObservedRunningTime="2026-03-01 09:30:15.127920062 +0000 UTC m=+1344.369799269" Mar 01 09:30:15 crc kubenswrapper[4792]: I0301 09:30:15.146536 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tjd85" podStartSLOduration=2.146518858 podStartE2EDuration="2.146518858s" podCreationTimestamp="2026-03-01 09:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:15.13952051 +0000 UTC m=+1344.381399707" watchObservedRunningTime="2026-03-01 09:30:15.146518858 +0000 UTC m=+1344.388398055" Mar 01 09:30:16 crc kubenswrapper[4792]: I0301 09:30:16.067376 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:16 crc kubenswrapper[4792]: I0301 09:30:16.100581 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.134370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.134862 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerStarted","Data":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.135726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerStarted","Data":"43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.137813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerStarted","Data":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.137836 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141126 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} Mar 01 09:30:18 crc 
kubenswrapper[4792]: I0301 09:30:18.141157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerStarted","Data":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141273 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" containerID="cri-o://93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.141306 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" containerID="cri-o://0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" gracePeriod=30 Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.163529 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.119655806 podStartE2EDuration="7.163510509s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="2026-03-01 09:30:12.979806444 +0000 UTC m=+1342.221685641" lastFinishedPulling="2026-03-01 09:30:17.023661147 +0000 UTC m=+1346.265540344" observedRunningTime="2026-03-01 09:30:18.158322235 +0000 UTC m=+1347.400201432" watchObservedRunningTime="2026-03-01 09:30:18.163510509 +0000 UTC m=+1347.405389706" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.178681 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.418364087 podStartE2EDuration="6.178663202s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.264747036 +0000 UTC m=+1342.506626233" lastFinishedPulling="2026-03-01 
09:30:17.025046141 +0000 UTC m=+1346.266925348" observedRunningTime="2026-03-01 09:30:18.173885248 +0000 UTC m=+1347.415764445" watchObservedRunningTime="2026-03-01 09:30:18.178663202 +0000 UTC m=+1347.420542399" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.201879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.753811261 podStartE2EDuration="6.201851828s" podCreationTimestamp="2026-03-01 09:30:12 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.582134507 +0000 UTC m=+1342.824013694" lastFinishedPulling="2026-03-01 09:30:17.030175064 +0000 UTC m=+1346.272054261" observedRunningTime="2026-03-01 09:30:18.18774829 +0000 UTC m=+1347.429627487" watchObservedRunningTime="2026-03-01 09:30:18.201851828 +0000 UTC m=+1347.443731025" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.217131 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.312282314 podStartE2EDuration="7.217111884s" podCreationTimestamp="2026-03-01 09:30:11 +0000 UTC" firstStartedPulling="2026-03-01 09:30:13.123394437 +0000 UTC m=+1342.365273634" lastFinishedPulling="2026-03-01 09:30:17.028224007 +0000 UTC m=+1346.270103204" observedRunningTime="2026-03-01 09:30:18.208450337 +0000 UTC m=+1347.450329534" watchObservedRunningTime="2026-03-01 09:30:18.217111884 +0000 UTC m=+1347.458991081" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.768191 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.946969 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947444 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") pod \"892ffeb5-f853-45a6-9ad4-a2a40437a406\" (UID: \"892ffeb5-f853-45a6-9ad4-a2a40437a406\") " Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.947716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs" (OuterVolumeSpecName: "logs") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.948156 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/892ffeb5-f853-45a6-9ad4-a2a40437a406-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.954544 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2" (OuterVolumeSpecName: "kube-api-access-gwtn2") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "kube-api-access-gwtn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.976300 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:18 crc kubenswrapper[4792]: I0301 09:30:18.979418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data" (OuterVolumeSpecName: "config-data") pod "892ffeb5-f853-45a6-9ad4-a2a40437a406" (UID: "892ffeb5-f853-45a6-9ad4-a2a40437a406"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050110 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050156 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwtn2\" (UniqueName: \"kubernetes.io/projected/892ffeb5-f853-45a6-9ad4-a2a40437a406-kube-api-access-gwtn2\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.050172 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/892ffeb5-f853-45a6-9ad4-a2a40437a406-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152514 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" exitCode=0 Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152814 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" exitCode=143 Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} Mar 01 09:30:19 crc 
kubenswrapper[4792]: I0301 09:30:19.152994 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"892ffeb5-f853-45a6-9ad4-a2a40437a406","Type":"ContainerDied","Data":"e1ddfe00df757587e10aacff0143490ad9255e7b1f73da1f009a37752ae1c648"} Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.153012 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.152668 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.185706 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.206208 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.225128 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.225548 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.226165 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226203 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} 
err="failed to get container status \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226230 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.226523 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226552 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} err="failed to get container status \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226570 4792 scope.go:117] "RemoveContainer" containerID="0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226791 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa"} err="failed to get container status \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": rpc error: code = NotFound desc = could not find container \"0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa\": container with ID starting with 0f9e913b971c9134bc2136ec2762bd10c403ebea92fa22f81cb94880f84799fa not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.226823 4792 scope.go:117] "RemoveContainer" containerID="93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.227355 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3"} err="failed to get container status \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": rpc error: code = NotFound desc = could not find container \"93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3\": container with ID starting with 93888e1a70b199ccd8eb105633970b355749358a35a6a8da18a1130d74ff49a3 not found: ID does not exist" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239019 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.239427 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239443 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: E0301 09:30:19.239458 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 
01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239465 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239640 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-metadata" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.239660 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" containerName="nova-metadata-log" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.240519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.249637 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.285745 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286223 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.286267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.288946 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.289403 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388507 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.388834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.394178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.402649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"nova-metadata-0\" 
(UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.409442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.409527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"nova-metadata-0\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " pod="openstack/nova-metadata-0" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.420002 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892ffeb5-f853-45a6-9ad4-a2a40437a406" path="/var/lib/kubelet/pods/892ffeb5-f853-45a6-9ad4-a2a40437a406/volumes" Mar 01 09:30:19 crc kubenswrapper[4792]: I0301 09:30:19.604979 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:20 crc kubenswrapper[4792]: I0301 09:30:20.053427 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:20 crc kubenswrapper[4792]: W0301 09:30:20.061030 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0521e997_0d3e_4e56_9302_cbd2c79e2c0a.slice/crio-327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4 WatchSource:0}: Error finding container 327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4: Status 404 returned error can't find the container with id 327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4 Mar 01 09:30:20 crc kubenswrapper[4792]: I0301 09:30:20.180045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.190668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.190940 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerStarted","Data":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} Mar 01 09:30:21 crc kubenswrapper[4792]: I0301 09:30:21.218601 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.218579633 podStartE2EDuration="2.218579633s" podCreationTimestamp="2026-03-01 09:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:21.20676514 +0000 UTC m=+1350.448644347" watchObservedRunningTime="2026-03-01 09:30:21.218579633 +0000 UTC m=+1350.460458830" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.205098 4792 generic.go:334] "Generic (PLEG): container finished" podID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerID="cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2" exitCode=0 Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.205226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerDied","Data":"cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2"} Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.314706 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.314955 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.654287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.654326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.694005 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.715161 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.749824 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:22 crc 
kubenswrapper[4792]: I0301 09:30:22.785922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.786394 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" containerID="cri-o://fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" gracePeriod=10 Mar 01 09:30:22 crc kubenswrapper[4792]: I0301 09:30:22.972336 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7675674687-rrbg6" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.219264 4792 generic.go:334] "Generic (PLEG): container finished" podID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerID="7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22" exitCode=0 Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.219344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerDied","Data":"7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22"} Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.237842 4792 generic.go:334] "Generic (PLEG): container finished" podID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerID="fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" exitCode=0 Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.238262 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869"} Mar 01 09:30:23 crc 
kubenswrapper[4792]: I0301 09:30:23.293736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.314000 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.404548 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.404577 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.490747 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.490844 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491083 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.491130 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") pod \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\" (UID: \"b5f281f2-c77c-49cc-93a4-e7ed029f29bb\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.522736 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl" (OuterVolumeSpecName: "kube-api-access-2j7zl") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "kube-api-access-2j7zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.569653 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.581695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593933 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7zl\" (UniqueName: \"kubernetes.io/projected/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-kube-api-access-2j7zl\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593962 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.593971 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.598119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.613482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config" (OuterVolumeSpecName: "config") pod "b5f281f2-c77c-49cc-93a4-e7ed029f29bb" (UID: "b5f281f2-c77c-49cc-93a4-e7ed029f29bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.638222 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.697335 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.697372 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5f281f2-c77c-49cc-93a4-e7ed029f29bb-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.798896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") pod \"32a84376-7418-49cd-9c62-fdd1af7ec31b\" (UID: \"32a84376-7418-49cd-9c62-fdd1af7ec31b\") " Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.809538 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625" (OuterVolumeSpecName: "kube-api-access-8l625") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "kube-api-access-8l625". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.811111 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts" (OuterVolumeSpecName: "scripts") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.840059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.847676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data" (OuterVolumeSpecName: "config-data") pod "32a84376-7418-49cd-9c62-fdd1af7ec31b" (UID: "32a84376-7418-49cd-9c62-fdd1af7ec31b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.904988 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905027 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905040 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l625\" (UniqueName: \"kubernetes.io/projected/32a84376-7418-49cd-9c62-fdd1af7ec31b-kube-api-access-8l625\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:23 crc kubenswrapper[4792]: I0301 09:30:23.905053 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a84376-7418-49cd-9c62-fdd1af7ec31b-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.255961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8vfwt" event={"ID":"32a84376-7418-49cd-9c62-fdd1af7ec31b","Type":"ContainerDied","Data":"3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61"} Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.256268 4792 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3c68bb44e29a7b14a923b915dc00892bf89acf28bc6734efebeb0e2ee1c07a61" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.256328 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8vfwt" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.268656 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-rrbg6" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.269044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-rrbg6" event={"ID":"b5f281f2-c77c-49cc-93a4-e7ed029f29bb","Type":"ContainerDied","Data":"23c50a76020a45988e337315d0efa1b099af136e2bf3deb4ff1bf47f7e64507f"} Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.269159 4792 scope.go:117] "RemoveContainer" containerID="fa470bfaa3e77451b41cb706fe5366ce1e220ca16dcf36e41c6bb50c6ad8d869" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.320594 4792 scope.go:117] "RemoveContainer" containerID="1bc8af094ffa7e776635e0ebadb742f3592eec13f050af88c7f45cc46dd6b7ae" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.338019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.355249 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7675674687-rrbg6"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.389061 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.389508 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" containerID="cri-o://0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 
09:30:24.390024 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" containerID="cri-o://e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.403340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417199 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417460 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" containerID="cri-o://08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.417814 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" containerID="cri-o://f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" gracePeriod=30 Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.605325 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.605366 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.788409 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.931758 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.931807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.932016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.932052 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") pod \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\" (UID: \"7269b8b7-440f-4fae-b0f1-f624e9d5b29a\") " Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.945030 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts" (OuterVolumeSpecName: "scripts") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.961625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd" (OuterVolumeSpecName: "kube-api-access-l5qnd") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "kube-api-access-l5qnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:24 crc kubenswrapper[4792]: I0301 09:30:24.987384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.002579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data" (OuterVolumeSpecName: "config-data") pod "7269b8b7-440f-4fae-b0f1-f624e9d5b29a" (UID: "7269b8b7-440f-4fae-b0f1-f624e9d5b29a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048209 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5qnd\" (UniqueName: \"kubernetes.io/projected/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-kube-api-access-l5qnd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048248 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048283 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.048293 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7269b8b7-440f-4fae-b0f1-f624e9d5b29a-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.147170 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.250952 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") pod \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\" (UID: \"0521e997-0d3e-4e56-9302-cbd2c79e2c0a\") " Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.251140 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs" (OuterVolumeSpecName: "logs") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.251607 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.258047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv" (OuterVolumeSpecName: "kube-api-access-8bshv") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "kube-api-access-8bshv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.283991 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290578 4792 generic.go:334] "Generic (PLEG): container finished" podID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" exitCode=0 Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290638 4792 generic.go:334] "Generic (PLEG): container finished" podID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" exitCode=143 Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290750 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0521e997-0d3e-4e56-9302-cbd2c79e2c0a","Type":"ContainerDied","Data":"327e848ad4f5537716892bd8c75c4caf7104a2e6089d9a069750a5ed9373fae4"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290808 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.290986 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.302182 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data" (OuterVolumeSpecName: "config-data") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tjd85" event={"ID":"7269b8b7-440f-4fae-b0f1-f624e9d5b29a","Type":"ContainerDied","Data":"c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303395 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f91939f7bb3b7f30d5eba3a1ceca8589d9bda852653f48b1879bd42cead480" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.303466 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tjd85" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.312797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313155 4792 generic.go:334] "Generic (PLEG): container finished" podID="49690073-1340-4686-bc4b-f69901bb45d9" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" exitCode=143 Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313288 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313319 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler" containerID="cri-o://43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" gracePeriod=30 Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313329 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313339 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313358 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="init" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313367 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="init" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313379 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313387 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313402 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313411 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.313436 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313445 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" containerName="nova-manage" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313676 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" containerName="nova-cell1-conductor-db-sync" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313687 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" containerName="dnsmasq-dns" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313705 4792 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-log" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.313724 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" containerName="nova-metadata-metadata" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.316810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.316963 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.320205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.338607 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.374155 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.375456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376649 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376795 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376807 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bshv\" (UniqueName: \"kubernetes.io/projected/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-kube-api-access-8bshv\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.376817 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.377787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 
01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.377834 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} err="failed to get container status \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.377867 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" Mar 01 09:30:25 crc kubenswrapper[4792]: E0301 09:30:25.380113 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0521e997-0d3e-4e56-9302-cbd2c79e2c0a" (UID: "0521e997-0d3e-4e56-9302-cbd2c79e2c0a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380148 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} err="failed to get container status \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": rpc error: code = NotFound desc = could not find container \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380187 4792 scope.go:117] "RemoveContainer" containerID="f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380482 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff"} err="failed to get container status \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": rpc error: code = NotFound desc = could not find container \"f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff\": container with ID starting with f6d95cb3f5287f3e4b2a3379b854f7695bc69ab2f663fff0b92f66887b19c4ff not found: ID does not exist" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380505 4792 scope.go:117] "RemoveContainer" containerID="08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.380827 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547"} err="failed to get container status \"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": rpc error: code = NotFound desc = could not find container 
\"08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547\": container with ID starting with 08fff41a871a4698232ae6125185f351896691d6f802d9111c8ea965e8178547 not found: ID does not exist" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.439131 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f281f2-c77c-49cc-93a4-e7ed029f29bb" path="/var/lib/kubelet/pods/b5f281f2-c77c-49cc-93a4-e7ed029f29bb/volumes" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479476 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479558 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.479802 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0521e997-0d3e-4e56-9302-cbd2c79e2c0a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.483841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.485255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef6cc4e-2fd6-403b-a163-638395c30672-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.496441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rfg\" (UniqueName: \"kubernetes.io/projected/9ef6cc4e-2fd6-403b-a163-638395c30672-kube-api-access-z4rfg\") pod \"nova-cell1-conductor-0\" (UID: \"9ef6cc4e-2fd6-403b-a163-638395c30672\") " pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.618296 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.625592 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.641879 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.652130 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.653767 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.666123 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.666324 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.691971 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786138 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.786427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889959 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " 
pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.890077 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.889862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894339 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.894752 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.932638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzd2\" (UniqueName: 
\"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"nova-metadata-0\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " pod="openstack/nova-metadata-0" Mar 01 09:30:25 crc kubenswrapper[4792]: I0301 09:30:25.970876 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.089345 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.330678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ef6cc4e-2fd6-403b-a163-638395c30672","Type":"ContainerStarted","Data":"3c916f3a8483dfafcf41e54c045dfc799303a49d176cbd1bf0abdc70d9f1810a"} Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.331043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.331056 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9ef6cc4e-2fd6-403b-a163-638395c30672","Type":"ContainerStarted","Data":"73174187fe252d0dfc1749cb6d328c9b3e5fc0df9679a189709faf90120b6094"} Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.352802 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.352786212 podStartE2EDuration="1.352786212s" podCreationTimestamp="2026-03-01 09:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:26.348191351 +0000 UTC m=+1355.590070548" watchObservedRunningTime="2026-03-01 09:30:26.352786212 +0000 UTC m=+1355.594665419" Mar 01 09:30:26 crc kubenswrapper[4792]: I0301 09:30:26.464116 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 01 09:30:26 crc kubenswrapper[4792]: W0301 09:30:26.465348 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66b40740_5f2c_4f3a_9d20_3307335829ed.slice/crio-243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d WatchSource:0}: Error finding container 243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d: Status 404 returned error can't find the container with id 243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.040104 4792 scope.go:117] "RemoveContainer" containerID="25e2a65861bddb5cf69014ea8d6e4a60ec1aeeeac6538cad770632d905286110" Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"} Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"} Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.346875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerStarted","Data":"243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d"} Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.376210 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.37619207 podStartE2EDuration="2.37619207s" podCreationTimestamp="2026-03-01 09:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:27.372590514 +0000 UTC m=+1356.614469721" watchObservedRunningTime="2026-03-01 09:30:27.37619207 +0000 UTC m=+1356.618071267" Mar 01 09:30:27 crc kubenswrapper[4792]: I0301 09:30:27.429504 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0521e997-0d3e-4e56-9302-cbd2c79e2c0a" path="/var/lib/kubelet/pods/0521e997-0d3e-4e56-9302-cbd2c79e2c0a/volumes" Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.656300 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.658087 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.659760 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 01 09:30:27 crc kubenswrapper[4792]: E0301 09:30:27.659819 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" 
containerName="nova-scheduler-scheduler" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.364780 4792 generic.go:334] "Generic (PLEG): container finished" podID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" exitCode=0 Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.364801 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerDied","Data":"43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03"} Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.490055 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661408 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.661431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") pod \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\" (UID: \"b8015a6e-cf5d-4728-b2e6-66bb8960fd40\") " Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.667526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9" (OuterVolumeSpecName: "kube-api-access-pwdw9") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "kube-api-access-pwdw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.686252 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.692422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data" (OuterVolumeSpecName: "config-data") pod "b8015a6e-cf5d-4728-b2e6-66bb8960fd40" (UID: "b8015a6e-cf5d-4728-b2e6-66bb8960fd40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763493 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763540 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdw9\" (UniqueName: \"kubernetes.io/projected/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-kube-api-access-pwdw9\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:29 crc kubenswrapper[4792]: I0301 09:30:29.763557 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8015a6e-cf5d-4728-b2e6-66bb8960fd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.246623 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") pod 
\"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") pod \"49690073-1340-4686-bc4b-f69901bb45d9\" (UID: \"49690073-1340-4686-bc4b-f69901bb45d9\") " Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.278970 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs" (OuterVolumeSpecName: "logs") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.292119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn" (OuterVolumeSpecName: "kube-api-access-dvskn") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "kube-api-access-dvskn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.309230 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.313065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data" (OuterVolumeSpecName: "config-data") pod "49690073-1340-4686-bc4b-f69901bb45d9" (UID: "49690073-1340-4686-bc4b-f69901bb45d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.377757 4792 generic.go:334] "Generic (PLEG): container finished" podID="49690073-1340-4686-bc4b-f69901bb45d9" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" exitCode=0 Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.377936 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"} Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378526 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"49690073-1340-4686-bc4b-f69901bb45d9","Type":"ContainerDied","Data":"527b5d3ab3c0911a39430a25a4440c1f01d3f9995da50cf30f22affc0352f613"} Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378559 4792 scope.go:117] "RemoveContainer" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.378086 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.380986 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49690073-1340-4686-bc4b-f69901bb45d9-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381010 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvskn\" (UniqueName: \"kubernetes.io/projected/49690073-1340-4686-bc4b-f69901bb45d9-kube-api-access-dvskn\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381061 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.381075 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49690073-1340-4686-bc4b-f69901bb45d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.391499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8015a6e-cf5d-4728-b2e6-66bb8960fd40","Type":"ContainerDied","Data":"8d6bd74bfd2c811ba8feb074f6cd879de61baffea6857711abd8ac6646b3af71"} Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.391546 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.403978 4792 scope.go:117] "RemoveContainer" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.431789 4792 scope.go:117] "RemoveContainer" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.432299 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": container with ID starting with e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7 not found: ID does not exist" containerID="e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.432356 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7"} err="failed to get container status \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": rpc error: code = NotFound desc = could not find container \"e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7\": container with ID starting with e37d4adb8d9d5ffbf813442c1fdbcfbbf0812ab4b1f3d2f82a5ad52ffcbb0ef7 not found: ID does not exist" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.432382 4792 scope.go:117] "RemoveContainer" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.436876 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": container with ID starting with 
0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8 not found: ID does not exist" containerID="0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.437098 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8"} err="failed to get container status \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": rpc error: code = NotFound desc = could not find container \"0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8\": container with ID starting with 0a8da701899d6dbb39ad32c67cda198a10066ca5649f3a3b10692182aafa59b8 not found: ID does not exist" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.437196 4792 scope.go:117] "RemoveContainer" containerID="43517df943397eb0f37738ca82db7645bf4ef1a52cc7a3b9599b08d521c29e03" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.444190 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.456454 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.475058 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.484347 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.501841 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502385 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502403 4792 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502413 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502419 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" Mar 01 09:30:30 crc kubenswrapper[4792]: E0301 09:30:30.502444 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502450 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502632 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-log" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502643 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="49690073-1340-4686-bc4b-f69901bb45d9" containerName="nova-api-api" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.502661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" containerName="nova-scheduler-scheduler" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.503556 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.516178 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.522031 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.523418 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.525510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.534511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.541396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn88\" (UniqueName: 
\"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.583594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.685965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686053 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.686838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.689725 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.689851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.690747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.694432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc 
kubenswrapper[4792]: I0301 09:30:30.703366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"nova-api-0\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.703368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"nova-scheduler-0\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.834992 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.851747 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.972303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:30 crc kubenswrapper[4792]: I0301 09:30:30.972626 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.256291 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.341319 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:30:31 crc kubenswrapper[4792]: W0301 09:30:31.349259 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b5702fe_b26d_43ee_b702_4ac5527947cd.slice/crio-100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332 
WatchSource:0}: Error finding container 100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332: Status 404 returned error can't find the container with id 100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332 Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.438332 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49690073-1340-4686-bc4b-f69901bb45d9" path="/var/lib/kubelet/pods/49690073-1340-4686-bc4b-f69901bb45d9/volumes" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.439755 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8015a6e-cf5d-4728-b2e6-66bb8960fd40" path="/var/lib/kubelet/pods/b8015a6e-cf5d-4728-b2e6-66bb8960fd40/volumes" Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.440793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"89a1a9514a04ba2ef6114510db9082d4635321014208a8dffde8aafd68862a7c"} Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.440854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerStarted","Data":"100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332"} Mar 01 09:30:31 crc kubenswrapper[4792]: I0301 09:30:31.542474 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.465377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.466592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerStarted","Data":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.471717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerStarted","Data":"1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e"} Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.498663 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.498643897 podStartE2EDuration="2.498643897s" podCreationTimestamp="2026-03-01 09:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:32.49669826 +0000 UTC m=+1361.738577467" watchObservedRunningTime="2026-03-01 09:30:32.498643897 +0000 UTC m=+1361.740523114" Mar 01 09:30:32 crc kubenswrapper[4792]: I0301 09:30:32.520095 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.52007489 podStartE2EDuration="2.52007489s" podCreationTimestamp="2026-03-01 09:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:32.518264317 +0000 UTC m=+1361.760143514" watchObservedRunningTime="2026-03-01 09:30:32.52007489 +0000 UTC m=+1361.761954097" Mar 01 09:30:33 crc kubenswrapper[4792]: I0301 09:30:33.582665 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:33 crc kubenswrapper[4792]: I0301 09:30:33.583160 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" 
containerID="cri-o://ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.071729 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.163581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") pod \"c5db40bf-18aa-4877-ad92-35d50c549309\" (UID: \"c5db40bf-18aa-4877-ad92-35d50c549309\") " Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.185626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf" (OuterVolumeSpecName: "kube-api-access-t5xkf") pod "c5db40bf-18aa-4877-ad92-35d50c549309" (UID: "c5db40bf-18aa-4877-ad92-35d50c549309"). InnerVolumeSpecName "kube-api-access-t5xkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.265759 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5xkf\" (UniqueName: \"kubernetes.io/projected/c5db40bf-18aa-4877-ad92-35d50c549309-kube-api-access-t5xkf\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495345 4792 generic.go:334] "Generic (PLEG): container finished" podID="c5db40bf-18aa-4877-ad92-35d50c549309" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" exitCode=2 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerDied","Data":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c5db40bf-18aa-4877-ad92-35d50c549309","Type":"ContainerDied","Data":"d3de3b349ed8682aadffbec7a09f7bd847d16614859016d386affe481743f302"} Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495465 4792 scope.go:117] "RemoveContainer" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.495700 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.524897 4792 scope.go:117] "RemoveContainer" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: E0301 09:30:34.526669 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": container with ID starting with ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b not found: ID does not exist" containerID="ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.526786 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b"} err="failed to get container status \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": rpc error: code = NotFound desc = could not find container \"ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b\": container with ID starting with ab032429016daa4de09e0b350e6149a60f82fbef10f58da185abc90fde56991b not found: ID does not exist" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.544988 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.552944 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.575969 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: E0301 09:30:34.576621 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc 
kubenswrapper[4792]: I0301 09:30:34.576716 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.577014 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5db40bf-18aa-4877-ad92-35d50c549309" containerName="kube-state-metrics" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.577673 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.583206 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.583385 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.597582 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673628 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.673717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.775285 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.782367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.785440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.786646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792655 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent" containerID="cri-o://98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 
09:30:34.792716 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core" containerID="cri-o://26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792780 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent" containerID="cri-o://b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.792875 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd" containerID="cri-o://ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" gracePeriod=30 Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.806753 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbwq\" (UniqueName: \"kubernetes.io/projected/6f21d62f-3539-4d5d-aeaa-cc816a51d412-kube-api-access-ddbwq\") pod \"kube-state-metrics-0\" (UID: \"6f21d62f-3539-4d5d-aeaa-cc816a51d412\") " pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.900219 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.943894 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.943971 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944013 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944769 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:30:34 crc kubenswrapper[4792]: I0301 09:30:34.944824 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" gracePeriod=600 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.418120 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5db40bf-18aa-4877-ad92-35d50c549309" path="/var/lib/kubelet/pods/c5db40bf-18aa-4877-ad92-35d50c549309/volumes" Mar 01 09:30:35 crc kubenswrapper[4792]: W0301 09:30:35.464922 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f21d62f_3539_4d5d_aeaa_cc816a51d412.slice/crio-52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf WatchSource:0}: Error finding container 52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf: Status 404 returned error can't find the container with id 52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.468950 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.505210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f21d62f-3539-4d5d-aeaa-cc816a51d412","Type":"ContainerStarted","Data":"52b3df514e15fe47df835ef406e9d71bccbbe7d4ea940261945dc0eb99090cbf"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508137 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.508268 4792 scope.go:117] "RemoveContainer" containerID="9fc8b9702d9d3591695478729a2a209996e2f83219ba8649a31afc02f286ad3f" Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516866 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516943 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" exitCode=2 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516958 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" exitCode=0 Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.516989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.517045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"} Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.517061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"} Mar 01 
09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.682352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.852525 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.971984 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:30:35 crc kubenswrapper[4792]: I0301 09:30:35.972261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.536811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6f21d62f-3539-4d5d-aeaa-cc816a51d412","Type":"ContainerStarted","Data":"f0e9d810566d8958a1c8efced4ac0839665a299e441c1c47872026aff6a7c43c"} Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.537458 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.564990 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.119348254 podStartE2EDuration="2.564971569s" podCreationTimestamp="2026-03-01 09:30:34 +0000 UTC" firstStartedPulling="2026-03-01 09:30:35.46736143 +0000 UTC m=+1364.709240627" lastFinishedPulling="2026-03-01 09:30:35.912984745 +0000 UTC m=+1365.154863942" observedRunningTime="2026-03-01 09:30:36.553274039 +0000 UTC m=+1365.795153236" watchObservedRunningTime="2026-03-01 09:30:36.564971569 +0000 UTC m=+1365.806850766" Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.988084 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:36 crc kubenswrapper[4792]: I0301 09:30:36.988305 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:39 crc kubenswrapper[4792]: I0301 09:30:39.984413 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.102818 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.102977 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103097 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103132 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.103202 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") pod \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\" (UID: \"0b9dfa7c-35ce-4f0d-9439-ed55e060a486\") " Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.105522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.105744 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.110199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts" (OuterVolumeSpecName: "scripts") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.117957 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp" (OuterVolumeSpecName: "kube-api-access-54hjp") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "kube-api-access-54hjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.152077 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.186735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205263 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54hjp\" (UniqueName: \"kubernetes.io/projected/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-kube-api-access-54hjp\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205416 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205476 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205551 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205607 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.205661 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.212152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data" (OuterVolumeSpecName: "config-data") pod "0b9dfa7c-35ce-4f0d-9439-ed55e060a486" (UID: "0b9dfa7c-35ce-4f0d-9439-ed55e060a486"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.307162 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b9dfa7c-35ce-4f0d-9439-ed55e060a486-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572566 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" exitCode=0 Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572608 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"} Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b9dfa7c-35ce-4f0d-9439-ed55e060a486","Type":"ContainerDied","Data":"13d1178136245bb5d7f800642b1b34582d1e453c7a1057abd9a450d34e19ac11"} Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.572682 4792 scope.go:117] "RemoveContainer" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.597216 4792 scope.go:117] "RemoveContainer" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.616163 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.643543 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661480 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661812 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661833 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661860 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661868 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.661890 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.661899 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.662006 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662015 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662233 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="proxy-httpd" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-central-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662278 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="sg-core" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.662288 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" containerName="ceilometer-notification-agent" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.663860 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.671740 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.671963 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.672096 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.682495 4792 scope.go:117] "RemoveContainer" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.683443 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.730137 4792 scope.go:117] "RemoveContainer" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.749856 4792 scope.go:117] "RemoveContainer" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.750254 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": container with ID starting with ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463 not found: ID does not exist" containerID="ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.750615 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463"} err="failed to get container status 
\"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": rpc error: code = NotFound desc = could not find container \"ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463\": container with ID starting with ea052cff5c038f68038711c9c5471ef75635b4c2956cef2d713687d1ae0e9463 not found: ID does not exist" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.750637 4792 scope.go:117] "RemoveContainer" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.751056 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": container with ID starting with 26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f not found: ID does not exist" containerID="26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.751112 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f"} err="failed to get container status \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": rpc error: code = NotFound desc = could not find container \"26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f\": container with ID starting with 26d4385a852587fbc619f3d05dc7aef49f2ba1f691477d85d71993fa4441f74f not found: ID does not exist" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.751155 4792 scope.go:117] "RemoveContainer" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.753478 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": container with ID starting with b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6 not found: ID does not exist" containerID="b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753503 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6"} err="failed to get container status \"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": rpc error: code = NotFound desc = could not find container \"b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6\": container with ID starting with b17992a1fa202276e607ebb67128eedd9320a4b892cde2ad53d17b1a493bd1e6 not found: ID does not exist" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753516 4792 scope.go:117] "RemoveContainer" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" Mar 01 09:30:40 crc kubenswrapper[4792]: E0301 09:30:40.753785 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": container with ID starting with 98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321 not found: ID does not exist" containerID="98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.753805 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321"} err="failed to get container status \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": rpc error: code = NotFound desc = could not find container \"98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321\": container with ID 
starting with 98dcb90013209d7c9aae298e15969a8ba76fb60c52fecc8c46812f9b8ef37321 not found: ID does not exist" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818424 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" 
Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818783 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.818843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.836429 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.836482 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.853048 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.880792 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.920930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921059 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc 
kubenswrapper[4792]: I0301 09:30:40.921209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.921249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.922845 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.923054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.927595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " 
pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.929930 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.946877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.957015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"ceilometer-0\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " pod="openstack/ceilometer-0" Mar 01 09:30:40 crc kubenswrapper[4792]: I0301 09:30:40.987927 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.417684 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9dfa7c-35ce-4f0d-9439-ed55e060a486" path="/var/lib/kubelet/pods/0b9dfa7c-35ce-4f0d-9439-ed55e060a486/volumes" Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.455481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:41 crc kubenswrapper[4792]: W0301 09:30:41.458449 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29759876_6cc9_4695_b6a1_b0204c0eeefe.slice/crio-af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf WatchSource:0}: Error finding container af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf: Status 404 returned error can't find the container with id af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.584216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf"} Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.635447 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.920440 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:41 crc kubenswrapper[4792]: I0301 09:30:41.920594 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 01 09:30:42 crc kubenswrapper[4792]: I0301 09:30:42.615114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"} Mar 01 09:30:43 crc kubenswrapper[4792]: I0301 09:30:43.627410 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"} Mar 01 09:30:43 crc kubenswrapper[4792]: I0301 09:30:43.627724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"} Mar 01 09:30:44 crc kubenswrapper[4792]: I0301 09:30:44.910470 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.646717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerStarted","Data":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"} Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.647074 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.674689 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.977116352 podStartE2EDuration="5.674497247s" 
podCreationTimestamp="2026-03-01 09:30:40 +0000 UTC" firstStartedPulling="2026-03-01 09:30:41.460473853 +0000 UTC m=+1370.702353040" lastFinishedPulling="2026-03-01 09:30:45.157854748 +0000 UTC m=+1374.399733935" observedRunningTime="2026-03-01 09:30:45.666900614 +0000 UTC m=+1374.908779811" watchObservedRunningTime="2026-03-01 09:30:45.674497247 +0000 UTC m=+1374.916376434" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.976995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.977073 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.990636 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:30:45 crc kubenswrapper[4792]: I0301 09:30:45.999470 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.530062 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676531 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" exitCode=137 Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerDied","Data":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"} Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2b24b4c8-4f85-4eae-93c7-3249c1a54f09","Type":"ContainerDied","Data":"666f673dbf785249e4b855230a831650c499d51cb9413297a868fb5bc1afca52"} Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676615 4792 scope.go:117] "RemoveContainer" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.676724 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692340 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.692481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") pod \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\" (UID: \"2b24b4c8-4f85-4eae-93c7-3249c1a54f09\") " Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.698769 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb" (OuterVolumeSpecName: "kube-api-access-5s2hb") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "kube-api-access-5s2hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.700033 4792 scope.go:117] "RemoveContainer" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" Mar 01 09:30:48 crc kubenswrapper[4792]: E0301 09:30:48.700506 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": container with ID starting with b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5 not found: ID does not exist" containerID="b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.700596 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5"} err="failed to get container status \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": rpc error: code = NotFound desc = could not find container \"b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5\": container with ID starting with b93abbe99982a2c4da73af61051b2d1996a55ea6924dc1c336ede901534900b5 not found: ID does not exist" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.724550 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data" (OuterVolumeSpecName: "config-data") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.735631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b24b4c8-4f85-4eae-93c7-3249c1a54f09" (UID: "2b24b4c8-4f85-4eae-93c7-3249c1a54f09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794697 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794731 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:48 crc kubenswrapper[4792]: I0301 09:30:48.794742 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2hb\" (UniqueName: \"kubernetes.io/projected/2b24b4c8-4f85-4eae-93c7-3249c1a54f09-kube-api-access-5s2hb\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.007841 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.017208 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.038751 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: E0301 09:30:49.039082 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039097 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039289 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" containerName="nova-cell1-novncproxy-novncproxy" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.039794 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041443 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041617 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.041831 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.056216 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.098990 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ztz\" (UniqueName: 
\"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.099263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.200601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.200974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ztz\" (UniqueName: \"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.201382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.204431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.204471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.205119 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.205493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63afaac7-c934-4410-b2b5-ab04ad085489-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.221607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ztz\" (UniqueName: \"kubernetes.io/projected/63afaac7-c934-4410-b2b5-ab04ad085489-kube-api-access-s5ztz\") pod \"nova-cell1-novncproxy-0\" (UID: \"63afaac7-c934-4410-b2b5-ab04ad085489\") " pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.355635 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.460256 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b24b4c8-4f85-4eae-93c7-3249c1a54f09" path="/var/lib/kubelet/pods/2b24b4c8-4f85-4eae-93c7-3249c1a54f09/volumes" Mar 01 09:30:49 crc kubenswrapper[4792]: I0301 09:30:49.866095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 01 09:30:49 crc kubenswrapper[4792]: W0301 09:30:49.870741 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63afaac7_c934_4410_b2b5_ab04ad085489.slice/crio-9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b WatchSource:0}: Error finding container 9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b: Status 404 returned error can't find the container with id 9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.696003 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63afaac7-c934-4410-b2b5-ab04ad085489","Type":"ContainerStarted","Data":"4d99695ffa50fb2178db1d19dd03219e4bc573c6f2cc98fa8df1e40c4d180dcb"} Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.696339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63afaac7-c934-4410-b2b5-ab04ad085489","Type":"ContainerStarted","Data":"9e55529456f0c08af6f0894b4440664e6083bc44293a3fb2e38ce9051ac65d5b"} Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.731396 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.7313786009999999 podStartE2EDuration="1.731378601s" podCreationTimestamp="2026-03-01 09:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:50.726602437 +0000 UTC m=+1379.968481634" watchObservedRunningTime="2026-03-01 09:30:50.731378601 +0000 UTC m=+1379.973257798" Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.976471 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:30:50 crc kubenswrapper[4792]: I0301 09:30:50.977408 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.091616 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.094191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.705585 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.719758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.936667 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.942190 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:51 crc kubenswrapper[4792]: I0301 09:30:51.973936 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.070850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.070960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.071058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.072227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.072266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.173898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174122 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.174896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.175024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"dnsmasq-dns-6c74598c69-2pgch\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.197596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"dnsmasq-dns-6c74598c69-2pgch\" 
(UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.261699 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:52 crc kubenswrapper[4792]: I0301 09:30:52.764972 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:30:52 crc kubenswrapper[4792]: W0301 09:30:52.771154 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dc02dae_4469_4e20_aca1_c85d7e451b7f.slice/crio-6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811 WatchSource:0}: Error finding container 6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811: Status 404 returned error can't find the container with id 6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811 Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722530 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" exitCode=0 Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c"} Mar 01 09:30:53 crc kubenswrapper[4792]: I0301 09:30:53.722868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerStarted","Data":"6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.356606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.515269 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.578788 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583317 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" containerID="cri-o://527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583722 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" containerID="cri-o://34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.583774 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" containerID="cri-o://27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.584183 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" containerID="cri-o://ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.733738 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" exitCode=2 Mar 01 
09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.733782 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735562 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerStarted","Data":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735656 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" containerID="cri-o://967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.735765 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" containerID="cri-o://59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" gracePeriod=30 Mar 01 09:30:54 crc kubenswrapper[4792]: I0301 09:30:54.758702 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" podStartSLOduration=3.758682628 podStartE2EDuration="3.758682628s" podCreationTimestamp="2026-03-01 09:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:30:54.752360676 +0000 UTC m=+1383.994239873" watchObservedRunningTime="2026-03-01 09:30:54.758682628 +0000 UTC m=+1384.000561825" Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749211 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" exitCode=0 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749580 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" exitCode=0 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.749660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752264 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" exitCode=143 Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} Mar 01 09:30:55 crc kubenswrapper[4792]: I0301 09:30:55.752740 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.201793 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256714 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256749 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.256882 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") pod \"29759876-6cc9-4695-b6a1-b0204c0eeefe\" (UID: \"29759876-6cc9-4695-b6a1-b0204c0eeefe\") " Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.257161 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.257484 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.267818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg" (OuterVolumeSpecName: "kube-api-access-nb7pg") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "kube-api-access-nb7pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.295579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts" (OuterVolumeSpecName: "scripts") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.365797 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.367334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370634 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370678 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370691 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29759876-6cc9-4695-b6a1-b0204c0eeefe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.370702 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb7pg\" (UniqueName: \"kubernetes.io/projected/29759876-6cc9-4695-b6a1-b0204c0eeefe-kube-api-access-nb7pg\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.405658 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.435287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data" (OuterVolumeSpecName: "config-data") pod "29759876-6cc9-4695-b6a1-b0204c0eeefe" (UID: "29759876-6cc9-4695-b6a1-b0204c0eeefe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472087 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472123 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472138 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.472146 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/29759876-6cc9-4695-b6a1-b0204c0eeefe-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764712 4792 generic.go:334] "Generic (PLEG): container finished" podID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" exitCode=0 Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"} Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29759876-6cc9-4695-b6a1-b0204c0eeefe","Type":"ContainerDied","Data":"af74a6693f05dc86b0605fd8f64e799df50bc1492dea119b6c7b89e5c5c846cf"} Mar 01 09:30:56 
crc kubenswrapper[4792]: I0301 09:30:56.764805 4792 scope.go:117] "RemoveContainer" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.764813 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.789081 4792 scope.go:117] "RemoveContainer" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.799147 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.811445 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.820747 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821293 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821372 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821444 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821501 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821572 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" 
Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821628 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.821691 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821741 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.821972 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-notification-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822036 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="proxy-httpd" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822105 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="sg-core" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.822167 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" containerName="ceilometer-central-agent" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.824557 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.826935 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.827377 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.827498 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.842896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.865486 4792 scope.go:117] "RemoveContainer" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.878950 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " 
pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879166 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.879338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.894929 4792 scope.go:117] "RemoveContainer" 
containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915264 4792 scope.go:117] "RemoveContainer" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.915723 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": container with ID starting with ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af not found: ID does not exist" containerID="ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915800 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af"} err="failed to get container status \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": rpc error: code = NotFound desc = could not find container \"ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af\": container with ID starting with ebc7b83a7e8bbf8a5ebc283245e6f34e1228e9544a5c7d8cdde9491f54d017af not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.915857 4792 scope.go:117] "RemoveContainer" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.916244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": container with ID starting with 34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013 not found: ID does not exist" containerID="34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013" Mar 01 09:30:56 crc 
kubenswrapper[4792]: I0301 09:30:56.916291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013"} err="failed to get container status \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": rpc error: code = NotFound desc = could not find container \"34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013\": container with ID starting with 34964964d66ee80be4a7e87aa9e1e71eff65ac803c33a8d8654d01bf69a0f013 not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916319 4792 scope.go:117] "RemoveContainer" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.916774 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": container with ID starting with 27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3 not found: ID does not exist" containerID="27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916798 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3"} err="failed to get container status \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": rpc error: code = NotFound desc = could not find container \"27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3\": container with ID starting with 27961c30989977627bc62cffbc34bfcddc0bf83961913bcb7f03340031c18bf3 not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.916811 4792 scope.go:117] "RemoveContainer" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 
09:30:56 crc kubenswrapper[4792]: E0301 09:30:56.917244 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": container with ID starting with 527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a not found: ID does not exist" containerID="527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.917266 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a"} err="failed to get container status \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": rpc error: code = NotFound desc = could not find container \"527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a\": container with ID starting with 527b30377b6256714a4388f7e23bcabdd65f592f13d11e6b52ac05c9342bfd2a not found: ID does not exist" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: 
\"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981371 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.981840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.982077 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 
09:30:56.982179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.982516 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.985342 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.986388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:56 crc kubenswrapper[4792]: I0301 09:30:56.986435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.000451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.000892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.007656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"ceilometer-0\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.163696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.418187 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29759876-6cc9-4695-b6a1-b0204c0eeefe" path="/var/lib/kubelet/pods/29759876-6cc9-4695-b6a1-b0204c0eeefe/volumes" Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.601962 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 09:30:57 crc kubenswrapper[4792]: W0301 09:30:57.604466 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4871267_e63c_4804_a404_869a0fdbd171.slice/crio-5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9 WatchSource:0}: Error finding container 5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9: Status 404 returned error can't find the container with id 5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9 Mar 01 09:30:57 crc kubenswrapper[4792]: I0301 09:30:57.773657 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9"} Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.238999 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.312816 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.313708 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs" 
(OuterVolumeSpecName: "logs") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.321671 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88" (OuterVolumeSpecName: "kube-api-access-lcn88") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "kube-api-access-lcn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.346024 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data podName:a9e82dc0-29af-47c8-bbef-1fd4bb999ff5 nodeName:}" failed. No retries permitted until 2026-03-01 09:30:58.846001704 +0000 UTC m=+1388.087880901 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5") : error deleting /var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volume-subpaths: remove /var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volume-subpaths: no such file or directory Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.349328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415415 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415450 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.415465 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcn88\" (UniqueName: \"kubernetes.io/projected/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-kube-api-access-lcn88\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.782168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"} Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784090 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" exitCode=0 Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"} Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5","Type":"ContainerDied","Data":"89a1a9514a04ba2ef6114510db9082d4635321014208a8dffde8aafd68862a7c"} 
Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784175 4792 scope.go:117] "RemoveContainer" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.784309 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.883567 4792 scope.go:117] "RemoveContainer" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.931420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") pod \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\" (UID: \"a9e82dc0-29af-47c8-bbef-1fd4bb999ff5\") " Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.936631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data" (OuterVolumeSpecName: "config-data") pod "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" (UID: "a9e82dc0-29af-47c8-bbef-1fd4bb999ff5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.954166 4792 scope.go:117] "RemoveContainer" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.957592 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": container with ID starting with 59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7 not found: ID does not exist" containerID="59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.957660 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7"} err="failed to get container status \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": rpc error: code = NotFound desc = could not find container \"59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7\": container with ID starting with 59421f9fcdf393f55cf74f2eb4e16fc9e1ecf0f2e44e7dfd8974930450018cc7 not found: ID does not exist" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.957684 4792 scope.go:117] "RemoveContainer" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" Mar 01 09:30:58 crc kubenswrapper[4792]: E0301 09:30:58.958283 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": container with ID starting with 967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf not found: ID does not exist" containerID="967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf" Mar 01 09:30:58 crc kubenswrapper[4792]: I0301 09:30:58.958311 
4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf"} err="failed to get container status \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": rpc error: code = NotFound desc = could not find container \"967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf\": container with ID starting with 967bfda4431903ec2dd2aafcf91861e2b4d545ee88e602fcc124a7870529bbdf not found: ID does not exist" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.036890 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.114478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.123460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.139576 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:59 crc kubenswrapper[4792]: E0301 09:30:59.140037 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140061 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" Mar 01 09:30:59 crc kubenswrapper[4792]: E0301 09:30:59.140093 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140101 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" 
containerName="nova-api-log" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140303 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-log" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.140345 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" containerName="nova-api-api" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.141415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.143771 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.145768 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.149557 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.154544 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240700 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 
09:30:59.240856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.240980 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.241016 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.241045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342177 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342220 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.342998 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.346447 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.347148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.347221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.356225 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.358276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.362393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"nova-api-0\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.384437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.424056 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e82dc0-29af-47c8-bbef-1fd4bb999ff5" path="/var/lib/kubelet/pods/a9e82dc0-29af-47c8-bbef-1fd4bb999ff5/volumes" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.456634 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.795080 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"} Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.795361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"} Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.812782 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 01 09:30:59 crc kubenswrapper[4792]: I0301 09:30:59.933106 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.017405 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.018474 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.020697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.021891 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.022528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.061281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.162718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163170 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163307 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.163340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.166191 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.166239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.167591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.187682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"nova-cell1-cell-mapping-6q8nq\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.388286 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805403 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.805417 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerStarted","Data":"c1554809fd8548d7b1bbefcc4dff40233ceca12e7183c20dd377bc2e84b15644"} Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.826730 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8267148770000001 podStartE2EDuration="1.826714877s" podCreationTimestamp="2026-03-01 09:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:00.822831183 +0000 UTC m=+1390.064710390" watchObservedRunningTime="2026-03-01 09:31:00.826714877 +0000 UTC m=+1390.068594074" Mar 01 09:31:00 crc kubenswrapper[4792]: I0301 09:31:00.878490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.815937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerStarted","Data":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"} Mar 01 
09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.816567 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.819063 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerStarted","Data":"c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166"} Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.819130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerStarted","Data":"2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4"} Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.845760 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.369367844 podStartE2EDuration="5.845737461s" podCreationTimestamp="2026-03-01 09:30:56 +0000 UTC" firstStartedPulling="2026-03-01 09:30:57.606781819 +0000 UTC m=+1386.848661016" lastFinishedPulling="2026-03-01 09:31:01.083151436 +0000 UTC m=+1390.325030633" observedRunningTime="2026-03-01 09:31:01.836860498 +0000 UTC m=+1391.078739695" watchObservedRunningTime="2026-03-01 09:31:01.845737461 +0000 UTC m=+1391.087616668" Mar 01 09:31:01 crc kubenswrapper[4792]: I0301 09:31:01.869226 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6q8nq" podStartSLOduration=2.869203794 podStartE2EDuration="2.869203794s" podCreationTimestamp="2026-03-01 09:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:01.861507019 +0000 UTC m=+1391.103386256" watchObservedRunningTime="2026-03-01 09:31:01.869203794 +0000 UTC m=+1391.111083001" Mar 01 09:31:02 crc 
kubenswrapper[4792]: I0301 09:31:02.263939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.319987 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.320282 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" containerID="cri-o://36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd" gracePeriod=10 Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.848797 4792 generic.go:334] "Generic (PLEG): container finished" podID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerID="36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd" exitCode=0 Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.849386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd"} Mar 01 09:31:02 crc kubenswrapper[4792]: I0301 09:31:02.971979 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028278 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028360 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.028492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") pod \"dae0c901-5f9c-4248-96dd-08acb2b5d278\" (UID: \"dae0c901-5f9c-4248-96dd-08acb2b5d278\") " Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.078956 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl" (OuterVolumeSpecName: "kube-api-access-rhfgl") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "kube-api-access-rhfgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.126063 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config" (OuterVolumeSpecName: "config") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.131604 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.131787 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhfgl\" (UniqueName: \"kubernetes.io/projected/dae0c901-5f9c-4248-96dd-08acb2b5d278-kube-api-access-rhfgl\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.147809 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.154407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.170425 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dae0c901-5f9c-4248-96dd-08acb2b5d278" (UID: "dae0c901-5f9c-4248-96dd-08acb2b5d278"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233006 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233039 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.233050 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae0c901-5f9c-4248-96dd-08acb2b5d278-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.859873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" 
event={"ID":"dae0c901-5f9c-4248-96dd-08acb2b5d278","Type":"ContainerDied","Data":"c9bb9d1c4be64e9583f3d8ffde433440cf4e39ab72b4447ef9f1e98ba96850ca"} Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.859961 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.860193 4792 scope.go:117] "RemoveContainer" containerID="36ef0d467aa78ea6675158c30915ca7e3bff42a879c627274dd90d0498f741fd" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.910453 4792 scope.go:117] "RemoveContainer" containerID="4f80ddb9167cc5bebd3ccdc43bf19f478b728967aa30e43898dc24927a2246f9" Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.934548 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:31:03 crc kubenswrapper[4792]: I0301 09:31:03.942743 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw"] Mar 01 09:31:05 crc kubenswrapper[4792]: I0301 09:31:05.420611 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" path="/var/lib/kubelet/pods/dae0c901-5f9c-4248-96dd-08acb2b5d278/volumes" Mar 01 09:31:06 crc kubenswrapper[4792]: I0301 09:31:06.890010 4792 generic.go:334] "Generic (PLEG): container finished" podID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerID="c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166" exitCode=0 Mar 01 09:31:06 crc kubenswrapper[4792]: I0301 09:31:06.890089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerDied","Data":"c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166"} Mar 01 09:31:07 crc kubenswrapper[4792]: I0301 09:31:07.714782 4792 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-7ff5b4cd7c-d6nmw" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.181:5353: i/o timeout" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.248094 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323659 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323720 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.323837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") pod \"8782d670-70cd-42cc-b4d7-c0c8275e457b\" (UID: \"8782d670-70cd-42cc-b4d7-c0c8275e457b\") " Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.330172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts" 
(OuterVolumeSpecName: "scripts") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.333338 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw" (OuterVolumeSpecName: "kube-api-access-jfbmw") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "kube-api-access-jfbmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.352870 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.362780 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data" (OuterVolumeSpecName: "config-data") pod "8782d670-70cd-42cc-b4d7-c0c8275e457b" (UID: "8782d670-70cd-42cc-b4d7-c0c8275e457b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426229 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426265 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfbmw\" (UniqueName: \"kubernetes.io/projected/8782d670-70cd-42cc-b4d7-c0c8275e457b-kube-api-access-jfbmw\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426279 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.426294 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8782d670-70cd-42cc-b4d7-c0c8275e457b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.909679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6q8nq" event={"ID":"8782d670-70cd-42cc-b4d7-c0c8275e457b","Type":"ContainerDied","Data":"2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4"} Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.910145 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2788d692590d5ea686a1f6514becf743d99bbd2329e64654d1e2c869653ccfb4" Mar 01 09:31:08 crc kubenswrapper[4792]: I0301 09:31:08.910275 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6q8nq" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.100437 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.100692 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" containerID="cri-o://1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.113566 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.114180 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" containerID="cri-o://b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.114208 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" containerID="cri-o://14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197531 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197767 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" containerID="cri-o://4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.197996 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" containerID="cri-o://a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" gracePeriod=30 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.693799 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.747797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748096 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748225 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748343 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.748543 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") pod \"0641dd98-580b-48cf-87e8-4e0c891e18bd\" (UID: \"0641dd98-580b-48cf-87e8-4e0c891e18bd\") " Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.759857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs" (OuterVolumeSpecName: "logs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.766081 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm" (OuterVolumeSpecName: "kube-api-access-dtjdm") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "kube-api-access-dtjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.815074 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data" (OuterVolumeSpecName: "config-data") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.830540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.831709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.850746 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.850931 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851004 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0641dd98-580b-48cf-87e8-4e0c891e18bd-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851061 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.851121 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjdm\" (UniqueName: \"kubernetes.io/projected/0641dd98-580b-48cf-87e8-4e0c891e18bd-kube-api-access-dtjdm\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.853351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0641dd98-580b-48cf-87e8-4e0c891e18bd" (UID: "0641dd98-580b-48cf-87e8-4e0c891e18bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925612 4792 generic.go:334] "Generic (PLEG): container finished" podID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" exitCode=0 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925660 4792 generic.go:334] "Generic (PLEG): container finished" podID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" exitCode=143 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.925765 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926203 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0641dd98-580b-48cf-87e8-4e0c891e18bd","Type":"ContainerDied","Data":"c1554809fd8548d7b1bbefcc4dff40233ceca12e7183c20dd377bc2e84b15644"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.926312 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.928582 4792 generic.go:334] "Generic (PLEG): container finished" podID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerID="1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" exitCode=0 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.928643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerDied","Data":"1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.940205 4792 generic.go:334] "Generic (PLEG): container finished" podID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" exitCode=143 Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.940267 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"} Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.953068 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0641dd98-580b-48cf-87e8-4e0c891e18bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.962931 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.979020 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:09 crc kubenswrapper[4792]: I0301 09:31:09.992455 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.022273 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.022502 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.022930 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023014 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023139 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023195 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023244 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023314 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023365 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.023425 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="init" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023477 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="init" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023686 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" containerName="nova-manage" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023768 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae0c901-5f9c-4248-96dd-08acb2b5d278" containerName="dnsmasq-dns" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023830 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" containerName="nova-api-api" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.023884 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" 
containerName="nova-api-log" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.024249 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024280 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} err="failed to get container status \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024301 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: E0301 09:31:10.024519 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024535 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} err="failed to get container status 
\"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024547 4792 scope.go:117] "RemoveContainer" containerID="14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024707 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608"} err="failed to get container status \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": rpc error: code = NotFound desc = could not find container \"14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608\": container with ID starting with 14d9113367f02407293d9d8ec1d10c56235ad48cabbbe3989c80dba6f7114608 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.024721 4792 scope.go:117] "RemoveContainer" containerID="b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.025257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9"} err="failed to get container status \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": rpc error: code = NotFound desc = could not find container \"b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9\": container with ID starting with b775ffba9d9cb2b16b5f1e635b93b908a1badacd8016ff7bb4d2a11a1bd09ef9 not found: ID does not exist" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.029303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.032919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.034799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.034799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.035439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.055946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.055996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056019 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056179 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.056199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 
09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.158884 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.159376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c6de822-b7f5-4530-bb5b-ca879ff899fc-logs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.167931 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-config-data\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.169802 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.170201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.174095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c6de822-b7f5-4530-bb5b-ca879ff899fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.181446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpt2c\" (UniqueName: \"kubernetes.io/projected/9c6de822-b7f5-4530-bb5b-ca879ff899fc-kube-api-access-dpt2c\") pod \"nova-api-0\" (UID: \"9c6de822-b7f5-4530-bb5b-ca879ff899fc\") " pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.362524 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.415677 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.569945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.570024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.570050 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") pod \"4b5702fe-b26d-43ee-b702-4ac5527947cd\" (UID: \"4b5702fe-b26d-43ee-b702-4ac5527947cd\") " Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.581103 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5" (OuterVolumeSpecName: "kube-api-access-k4tl5") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "kube-api-access-k4tl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.608929 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data" (OuterVolumeSpecName: "config-data") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.610121 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b5702fe-b26d-43ee-b702-4ac5527947cd" (UID: "4b5702fe-b26d-43ee-b702-4ac5527947cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675160 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tl5\" (UniqueName: \"kubernetes.io/projected/4b5702fe-b26d-43ee-b702-4ac5527947cd-kube-api-access-k4tl5\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675403 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.675414 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5702fe-b26d-43ee-b702-4ac5527947cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4b5702fe-b26d-43ee-b702-4ac5527947cd","Type":"ContainerDied","Data":"100c00aaeb772803e831be7330ba512b669313e4b7b45f0161f7576070d6c332"} Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957237 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.957239 4792 scope.go:117] "RemoveContainer" containerID="1130ef58928c6df8139553b0ba2bbd03e66b69c09b50f9867cc8cd0a70ce6c1e" Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.991518 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:10 crc kubenswrapper[4792]: I0301 09:31:10.998992 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016170 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: E0301 09:31:11.016553 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016566 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.016754 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" containerName="nova-scheduler-scheduler" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.017332 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.029080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.048708 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.084556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.102717 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186067 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: 
\"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.186208 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.190388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-config-data\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.190707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a38c1a1-88bc-4bce-aea4-13e676aab111-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.202054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dn4\" (UniqueName: \"kubernetes.io/projected/3a38c1a1-88bc-4bce-aea4-13e676aab111-kube-api-access-b5dn4\") pod \"nova-scheduler-0\" (UID: \"3a38c1a1-88bc-4bce-aea4-13e676aab111\") " pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.333809 4792 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.436583 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0641dd98-580b-48cf-87e8-4e0c891e18bd" path="/var/lib/kubelet/pods/0641dd98-580b-48cf-87e8-4e0c891e18bd/volumes" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.437724 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5702fe-b26d-43ee-b702-4ac5527947cd" path="/var/lib/kubelet/pods/4b5702fe-b26d-43ee-b702-4ac5527947cd/volumes" Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.794948 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.968939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3a38c1a1-88bc-4bce-aea4-13e676aab111","Type":"ContainerStarted","Data":"f7bebc52f2b39ad43604457deeb01e7bc04c34789e04adf6c5dbee2cda5995b7"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.970019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3a38c1a1-88bc-4bce-aea4-13e676aab111","Type":"ContainerStarted","Data":"4c3ff6612499c7c31496ed0b4566e6edddcc71e37f473a254a4a649a94b6bdb0"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"9b6cf9899e3d0346caebae9b1112afc79f6f3651efb183116d398b187fb43516"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"7491a57576f3b0a70db6735da990c539632ed6193f106bb5da1431b74de073b5"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.972471 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c6de822-b7f5-4530-bb5b-ca879ff899fc","Type":"ContainerStarted","Data":"ba581e26517112c0e6879caacc25eab02457091162faddb066689dd4c874ddef"} Mar 01 09:31:11 crc kubenswrapper[4792]: I0301 09:31:11.983098 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9830791140000001 podStartE2EDuration="1.983079114s" podCreationTimestamp="2026-03-01 09:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:11.981450665 +0000 UTC m=+1401.223329872" watchObservedRunningTime="2026-03-01 09:31:11.983079114 +0000 UTC m=+1401.224958311" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.004477 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.004463206 podStartE2EDuration="3.004463206s" podCreationTimestamp="2026-03-01 09:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:12.000710076 +0000 UTC m=+1401.242589283" watchObservedRunningTime="2026-03-01 09:31:12.004463206 +0000 UTC m=+1401.246342403" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.337470 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:55656->10.217.0.186:8775: read: connection reset by peer" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.338169 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" probeResult="failure" output="Get 
\"https://10.217.0.186:8775/\": read tcp 10.217.0.2:55670->10.217.0.186:8775: read: connection reset by peer" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.736725 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815636 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815745 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.815835 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") pod 
\"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.816348 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs" (OuterVolumeSpecName: "logs") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.837816 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2" (OuterVolumeSpecName: "kube-api-access-lqzd2") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "kube-api-access-lqzd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.867004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: E0301 09:31:12.882280 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data podName:66b40740-5f2c-4f3a-9d20-3307335829ed nodeName:}" failed. No retries permitted until 2026-03-01 09:31:13.382255214 +0000 UTC m=+1402.624134411 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed") : error deleting /var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volume-subpaths: remove /var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volume-subpaths: no such file or directory Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.894185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.917844 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b40740-5f2c-4f3a-9d20-3307335829ed-logs\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918183 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzd2\" (UniqueName: \"kubernetes.io/projected/66b40740-5f2c-4f3a-9d20-3307335829ed-kube-api-access-lqzd2\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918288 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.918371 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 
09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983557 4792 generic.go:334] "Generic (PLEG): container finished" podID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" exitCode=0 Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983606 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.983633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"} Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.984378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66b40740-5f2c-4f3a-9d20-3307335829ed","Type":"ContainerDied","Data":"243858d3fba357536368a417ecf1a917b39cc53d20c017fb5d7083ee13365f6d"} Mar 01 09:31:12 crc kubenswrapper[4792]: I0301 09:31:12.984407 4792 scope.go:117] "RemoveContainer" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.009137 4792 scope.go:117] "RemoveContainer" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.040380 4792 scope.go:117] "RemoveContainer" containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.040964 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": container with ID starting with a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8 not found: ID does not exist" 
containerID="a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041072 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8"} err="failed to get container status \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": rpc error: code = NotFound desc = could not find container \"a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8\": container with ID starting with a14ebd244facee732540cda9c7f5ef2f0b1a5f3035c360a40a8a1d517d895ef8 not found: ID does not exist" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041178 4792 scope.go:117] "RemoveContainer" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.041533 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": container with ID starting with 4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed not found: ID does not exist" containerID="4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.041580 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed"} err="failed to get container status \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": rpc error: code = NotFound desc = could not find container \"4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed\": container with ID starting with 4824e5f7f0b4542375429e0324e26b504dd03e3e54b0250d4667215c3a5784ed not found: ID does not exist" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.426466 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") pod \"66b40740-5f2c-4f3a-9d20-3307335829ed\" (UID: \"66b40740-5f2c-4f3a-9d20-3307335829ed\") " Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.441320 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data" (OuterVolumeSpecName: "config-data") pod "66b40740-5f2c-4f3a-9d20-3307335829ed" (UID: "66b40740-5f2c-4f3a-9d20-3307335829ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.530157 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b40740-5f2c-4f3a-9d20-3307335829ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.625546 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.635649 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.653433 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.654440 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654464 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: E0301 09:31:13.654517 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" 
containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654525 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654802 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-log" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.654843 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" containerName="nova-metadata-metadata" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.656534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.662944 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.663753 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.681752 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" 
(UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836705 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.836829 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: 
I0301 09:31:13.938944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.938967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.939012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.939947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbf9560f-212f-460a-9a4d-250e20b00d18-logs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.944813 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-config-data\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.947814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.969606 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbf9560f-212f-460a-9a4d-250e20b00d18-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.969758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26tw\" (UniqueName: \"kubernetes.io/projected/cbf9560f-212f-460a-9a4d-250e20b00d18-kube-api-access-l26tw\") pod \"nova-metadata-0\" (UID: \"cbf9560f-212f-460a-9a4d-250e20b00d18\") " pod="openstack/nova-metadata-0" Mar 01 09:31:13 crc kubenswrapper[4792]: I0301 09:31:13.993236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 01 09:31:14 crc kubenswrapper[4792]: I0301 09:31:14.581288 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 01 09:31:14 crc kubenswrapper[4792]: W0301 09:31:14.586998 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbf9560f_212f_460a_9a4d_250e20b00d18.slice/crio-6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207 WatchSource:0}: Error finding container 6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207: Status 404 returned error can't find the container with id 6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207 Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003423 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"9217f2a680050cc383512f0d64b46639ceeeab526b10ba87c7de4c729b4ec997"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003766 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"1e0244cd2b05cdedbcffd147075822a2efed61e61904b9966abbaf4c48301cc9"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.003778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbf9560f-212f-460a-9a4d-250e20b00d18","Type":"ContainerStarted","Data":"6a2c86e35a00a104b3d5a7496abb4ae76c54acf5921be847bd8052b9a10ab207"} Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.021077 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.021059748 podStartE2EDuration="2.021059748s" podCreationTimestamp="2026-03-01 09:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:15.020670509 +0000 UTC m=+1404.262549706" watchObservedRunningTime="2026-03-01 09:31:15.021059748 +0000 UTC m=+1404.262938945" Mar 01 09:31:15 crc kubenswrapper[4792]: I0301 09:31:15.420198 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b40740-5f2c-4f3a-9d20-3307335829ed" path="/var/lib/kubelet/pods/66b40740-5f2c-4f3a-9d20-3307335829ed/volumes" Mar 01 09:31:16 crc kubenswrapper[4792]: I0301 09:31:16.334221 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 01 09:31:18 crc kubenswrapper[4792]: I0301 09:31:18.993628 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:31:18 crc kubenswrapper[4792]: I0301 09:31:18.994996 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 01 09:31:20 crc kubenswrapper[4792]: I0301 09:31:20.362896 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:31:20 crc kubenswrapper[4792]: I0301 09:31:20.364456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.335209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.357792 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.373136 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c6de822-b7f5-4530-bb5b-ca879ff899fc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:21 crc kubenswrapper[4792]: I0301 09:31:21.373163 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c6de822-b7f5-4530-bb5b-ca879ff899fc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:22 crc kubenswrapper[4792]: I0301 09:31:22.140532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 01 09:31:23 crc kubenswrapper[4792]: I0301 09:31:23.993375 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:31:23 crc kubenswrapper[4792]: I0301 09:31:23.993689 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 01 09:31:25 crc 
kubenswrapper[4792]: I0301 09:31:25.005081 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbf9560f-212f-460a-9a4d-250e20b00d18" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:25 crc kubenswrapper[4792]: I0301 09:31:25.005082 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbf9560f-212f-460a-9a4d-250e20b00d18" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.175613 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.265698 4792 scope.go:117] "RemoveContainer" containerID="f4b48983a710c40494648dd6a515d3975deee5c28f7b927750a63de93e040785" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.298133 4792 scope.go:117] "RemoveContainer" containerID="8fa90defd7fd5e3f42f2b3e3f4e2672234268081fbce5bff10b4ec243b2afba1" Mar 01 09:31:27 crc kubenswrapper[4792]: I0301 09:31:27.328970 4792 scope.go:117] "RemoveContainer" containerID="2712402bbe2a400f1f172cc0f249c7e35edf7b64593d06c6d1cfd9d81ee06f57" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.371141 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.372568 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.373131 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 01 09:31:30 crc kubenswrapper[4792]: I0301 09:31:30.382410 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:31:31 crc kubenswrapper[4792]: I0301 09:31:31.140748 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 01 09:31:31 crc kubenswrapper[4792]: I0301 09:31:31.147872 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 01 09:31:33 crc kubenswrapper[4792]: I0301 09:31:33.999192 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.006049 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.007925 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 01 09:31:34 crc kubenswrapper[4792]: I0301 09:31:34.170459 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 01 09:31:41 crc kubenswrapper[4792]: I0301 09:31:41.303400 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:42 crc kubenswrapper[4792]: I0301 09:31:42.839776 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:46 crc kubenswrapper[4792]: I0301 09:31:46.872476 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" containerID="cri-o://e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" gracePeriod=604795 Mar 01 09:31:47 crc kubenswrapper[4792]: I0301 09:31:47.596132 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" 
containerName="rabbitmq" containerID="cri-o://763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" gracePeriod=604796 Mar 01 09:31:52 crc kubenswrapper[4792]: I0301 09:31:52.928120 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356275 4792 generic.go:334] "Generic (PLEG): container finished" podID="6252a079-917c-46e8-a848-10569e1e057e" containerID="e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" exitCode=0 Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251"} Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6252a079-917c-46e8-a848-10569e1e057e","Type":"ContainerDied","Data":"dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086"} Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.356592 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda196254ac808b630ce64a84572b51fae2b42596880acb14f053e9eec301086" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.423262 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599725 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599769 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599823 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599973 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.599990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.600014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.600035 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") pod \"6252a079-917c-46e8-a848-10569e1e057e\" (UID: \"6252a079-917c-46e8-a848-10569e1e057e\") " Mar 01 09:31:53 crc 
kubenswrapper[4792]: I0301 09:31:53.601523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.601592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.601594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.610811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info" (OuterVolumeSpecName: "pod-info") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.610860 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.611034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.613138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv" (OuterVolumeSpecName: "kube-api-access-qnscv") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "kube-api-access-qnscv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.632326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.690401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data" (OuterVolumeSpecName: "config-data") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703426 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703488 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703502 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6252a079-917c-46e8-a848-10569e1e057e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703512 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703520 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnscv\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-kube-api-access-qnscv\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703528 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703536 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703543 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.703553 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6252a079-917c-46e8-a848-10569e1e057e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.708123 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf" (OuterVolumeSpecName: "server-conf") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.743328 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.763598 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6252a079-917c-46e8-a848-10569e1e057e" (UID: "6252a079-917c-46e8-a848-10569e1e057e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804591 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804743 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6252a079-917c-46e8-a848-10569e1e057e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:53 crc kubenswrapper[4792]: I0301 09:31:53.804753 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6252a079-917c-46e8-a848-10569e1e057e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.055163 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218199 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: 
\"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218479 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.218610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") pod \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\" (UID: \"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24\") " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.219467 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.219733 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.221839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.224878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc" (OuterVolumeSpecName: "kube-api-access-q8fhc") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "kube-api-access-q8fhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.224997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.226133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.235093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.246065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.277218 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data" (OuterVolumeSpecName: "config-data") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.320986 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-pod-info\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321015 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321025 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fhc\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-kube-api-access-q8fhc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321191 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321205 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321213 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321222 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: 
I0301 09:31:54.321229 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321260 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.321980 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.339152 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.362525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" (UID: "2d63e2d4-fa65-425a-8578-d7b9f8c5ba24"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372488 4792 generic.go:334] "Generic (PLEG): container finished" podID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" exitCode=0 Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372578 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372576 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"} Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d63e2d4-fa65-425a-8578-d7b9f8c5ba24","Type":"ContainerDied","Data":"3e8de91b3c58261b32cbdb52401a16acdc8aa762850b0b7a587dfa85e98e1d6e"} Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.372657 4792 scope.go:117] "RemoveContainer" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.373526 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422547 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422575 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.422585 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24-server-conf\") on node \"crc\" DevicePath \"\"" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.431865 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:54 crc 
kubenswrapper[4792]: I0301 09:31:54.442127 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.450855 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.474427 4792 scope.go:117] "RemoveContainer" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.509998 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531028 4792 scope.go:117] "RemoveContainer" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.531483 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": container with ID starting with 763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5 not found: ID does not exist" containerID="763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531592 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5"} err="failed to get container status \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": rpc error: code = NotFound desc = could not find container \"763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5\": container with ID starting with 763a17ce168e713296e67217a330ca93fd37d2fe6e80cda59899f22b0afab4c5 not found: ID does not exist" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.531694 4792 scope.go:117] "RemoveContainer" 
containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b" Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.532193 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": container with ID starting with 6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b not found: ID does not exist" containerID="6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.532323 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b"} err="failed to get container status \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": rpc error: code = NotFound desc = could not find container \"6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b\": container with ID starting with 6ee0d7e342e549ff7a0c75a829bbad1f0458089ec98c553779c523b140c0f36b not found: ID does not exist" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.547605 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548012 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548031 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548057 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548063 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548074 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="setup-container" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548080 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="setup-container" Mar 01 09:31:54 crc kubenswrapper[4792]: E0301 09:31:54.548089 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="setup-container" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548095 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="setup-container" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548253 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.548267 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.549232 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.552478 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.552799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553239 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553815 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5zwb6" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.553872 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.571603 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.575614 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.580705 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.580975 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581003 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.581073 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-584kl" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.582244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.588334 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.600412 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.625984 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 
crc kubenswrapper[4792]: I0301 09:31:54.626043 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626140 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626160 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626183 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626195 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.626226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.727982 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.728951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729055 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729307 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.729762 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-config-data\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/63658b27-63d9-4a0f-afca-3a3c245b9b9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.730522 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.733224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/63658b27-63d9-4a0f-afca-3a3c245b9b9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.735695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/63658b27-63d9-4a0f-afca-3a3c245b9b9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.736193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.736745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 
crc kubenswrapper[4792]: I0301 09:31:54.749104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nnp2\" (UniqueName: \"kubernetes.io/projected/63658b27-63d9-4a0f-afca-3a3c245b9b9d-kube-api-access-5nnp2\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.763796 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"63658b27-63d9-4a0f-afca-3a3c245b9b9d\") " pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830169 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830376 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.830986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831088 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.831767 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832359 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.832770 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.833351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.835944 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.836395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.849276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhq6d\" (UniqueName: \"kubernetes.io/projected/e0e1dd7a-6a53-446d-bf90-5813f7a3fda0-kube-api-access-vhq6d\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.858186 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0\") " pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.867325 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 01 09:31:54 crc kubenswrapper[4792]: I0301 09:31:54.892819 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.353352 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 01 09:31:55 crc kubenswrapper[4792]: W0301 09:31:55.361532 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63658b27_63d9_4a0f_afca_3a3c245b9b9d.slice/crio-f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0 WatchSource:0}: Error finding container f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0: Status 404 returned error can't find the container with id f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0 Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.381671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"f695ccbc44c0b0adbed26c0a4d27156ab9f6b736a46cdb18c00e990b0e4ef3c0"} Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.423375 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d63e2d4-fa65-425a-8578-d7b9f8c5ba24" path="/var/lib/kubelet/pods/2d63e2d4-fa65-425a-8578-d7b9f8c5ba24/volumes" Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.424156 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6252a079-917c-46e8-a848-10569e1e057e" path="/var/lib/kubelet/pods/6252a079-917c-46e8-a848-10569e1e057e/volumes" Mar 01 09:31:55 crc kubenswrapper[4792]: I0301 09:31:55.454274 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 01 09:31:55 crc kubenswrapper[4792]: W0301 09:31:55.470600 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e1dd7a_6a53_446d_bf90_5813f7a3fda0.slice/crio-a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb WatchSource:0}: Error finding container a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb: Status 404 returned error can't find the container with id a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb Mar 01 09:31:56 crc kubenswrapper[4792]: I0301 09:31:56.392927 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"a99e25d1f8fff02f96e102db5bcabc13c7b4a765ad26ce58ed52f48bec020deb"} Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.271601 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.274026 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.279980 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.287052 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382535 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.382844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.402015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764"} Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.404804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede"} Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484672 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484808 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.484979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " 
pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.485085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.485169 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.486826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.487029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.487745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 
09:31:57.487781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.488239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.508800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"dnsmasq-dns-59c44489bc-lcb28\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.597190 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:57 crc kubenswrapper[4792]: I0301 09:31:57.933244 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.232543 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6252a079-917c-46e8-a848-10569e1e057e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: i/o timeout" Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413351 4792 generic.go:334] "Generic (PLEG): container finished" podID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerID="05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c" exitCode=0 Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c"} Mar 01 09:31:58 crc kubenswrapper[4792]: I0301 09:31:58.413424 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerStarted","Data":"31553e15cd3569da29e5d2b0a1053e7e96e46dcf7276c14267ec5547dc01b54a"} Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.422830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerStarted","Data":"6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781"} Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.423133 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:31:59 crc kubenswrapper[4792]: I0301 09:31:59.451873 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" podStartSLOduration=2.451854309 podStartE2EDuration="2.451854309s" podCreationTimestamp="2026-03-01 09:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:31:59.443009503 +0000 UTC m=+1448.684888700" watchObservedRunningTime="2026-03-01 09:31:59.451854309 +0000 UTC m=+1448.693733506" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.132417 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.133580 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.137104 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.138477 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.138756 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.146858 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.229362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " 
pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.331137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.349636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"auto-csr-approver-29539292-vszff\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.502335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:00 crc kubenswrapper[4792]: I0301 09:32:00.920583 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:32:01 crc kubenswrapper[4792]: I0301 09:32:01.444920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerStarted","Data":"37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c"} Mar 01 09:32:03 crc kubenswrapper[4792]: I0301 09:32:03.464607 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerStarted","Data":"8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d"} Mar 01 09:32:03 crc kubenswrapper[4792]: I0301 09:32:03.484750 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539292-vszff" podStartSLOduration=1.6695277339999999 podStartE2EDuration="3.484734774s" podCreationTimestamp="2026-03-01 09:32:00 +0000 UTC" firstStartedPulling="2026-03-01 09:32:00.923057737 +0000 UTC m=+1450.164936944" lastFinishedPulling="2026-03-01 09:32:02.738264777 +0000 UTC m=+1451.980143984" observedRunningTime="2026-03-01 09:32:03.479030761 +0000 UTC m=+1452.720909958" watchObservedRunningTime="2026-03-01 09:32:03.484734774 +0000 UTC m=+1452.726613981" Mar 01 09:32:04 crc kubenswrapper[4792]: I0301 09:32:04.476166 4792 generic.go:334] "Generic (PLEG): container finished" podID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerID="8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d" exitCode=0 Mar 01 09:32:04 crc kubenswrapper[4792]: I0301 09:32:04.476231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerDied","Data":"8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d"} Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.828039 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.953580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") pod \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\" (UID: \"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc\") " Mar 01 09:32:05 crc kubenswrapper[4792]: I0301 09:32:05.959185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp" (OuterVolumeSpecName: "kube-api-access-kwrnp") pod "13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" (UID: "13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc"). InnerVolumeSpecName "kube-api-access-kwrnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.055796 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwrnp\" (UniqueName: \"kubernetes.io/projected/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc-kube-api-access-kwrnp\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539292-vszff" event={"ID":"13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc","Type":"ContainerDied","Data":"37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c"} Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494782 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a36460b8475148e96ac7c4dad0fb77508fa47b713a2dfd5024df5d2510ac5c" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.494840 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539292-vszff" Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.555180 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:32:06 crc kubenswrapper[4792]: I0301 09:32:06.563824 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539286-l47hq"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.418757 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeb77af-03ae-4e32-80a6-3c16ed5ef64e" path="/var/lib/kubelet/pods/2eeb77af-03ae-4e32-80a6-3c16ed5ef64e/volumes" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.599125 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.684014 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.684217 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" containerID="cri-o://a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" gracePeriod=10 Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.853987 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:07 crc kubenswrapper[4792]: E0301 09:32:07.854349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.854363 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.854542 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" containerName="oc" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.864362 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.871035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod 
\"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:07 crc kubenswrapper[4792]: I0301 09:32:07.995829 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097416 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod 
\"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.097501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098420 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " 
pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.098786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.099147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.099180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.130137 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"dnsmasq-dns-cb7494899-9x44w\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.195256 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.206533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301643 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.301674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") pod \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\" (UID: \"3dc02dae-4469-4e20-aca1-c85d7e451b7f\") " Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.322939 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm" (OuterVolumeSpecName: "kube-api-access-drdkm") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "kube-api-access-drdkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.357300 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.358850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config" (OuterVolumeSpecName: "config") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.361534 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.365574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dc02dae-4469-4e20-aca1-c85d7e451b7f" (UID: "3dc02dae-4469-4e20-aca1-c85d7e451b7f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404938 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404968 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404982 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drdkm\" (UniqueName: \"kubernetes.io/projected/3dc02dae-4469-4e20-aca1-c85d7e451b7f-kube-api-access-drdkm\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.404993 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-config\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.405006 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc02dae-4469-4e20-aca1-c85d7e451b7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512595 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" exitCode=0 Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 
09:32:08.512667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" event={"ID":"3dc02dae-4469-4e20-aca1-c85d7e451b7f","Type":"ContainerDied","Data":"6c7fd81ed9af7972987f7c71c071457fa812a6cd3376e81094962a9f69854811"} Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512687 4792 scope.go:117] "RemoveContainer" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.512831 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-2pgch" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.536404 4792 scope.go:117] "RemoveContainer" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.557385 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.564105 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-2pgch"] Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.575702 4792 scope.go:117] "RemoveContainer" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: E0301 09:32:08.576449 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": container with ID starting with a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954 not found: ID does not exist" containerID="a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.576475 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954"} err="failed to get container status \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": rpc error: code = NotFound desc = could not find container \"a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954\": container with ID starting with a09dbf98a35444aba5d4b5bed5ccc4f027fcae4773128cc7e35e38da6f0a3954 not found: ID does not exist" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.576496 4792 scope.go:117] "RemoveContainer" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: E0301 09:32:08.577478 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": container with ID starting with e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c not found: ID does not exist" containerID="e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c" Mar 01 09:32:08 crc kubenswrapper[4792]: I0301 09:32:08.577501 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c"} err="failed to get container status \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": rpc error: code = NotFound desc = could not find container \"e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c\": container with ID starting with e23f1b79cd43eb2011bd766bf49f85132d9c752379520cf1459e680d0fc6f48c not found: ID does not exist" Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.266691 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 09:32:09 crc kubenswrapper[4792]: W0301 09:32:09.280033 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3a408d8_0510_4867_8517_e609d614a5d2.slice/crio-3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874 WatchSource:0}: Error finding container 3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874: Status 404 returned error can't find the container with id 3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874 Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.421335 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" path="/var/lib/kubelet/pods/3dc02dae-4469-4e20-aca1-c85d7e451b7f/volumes" Mar 01 09:32:09 crc kubenswrapper[4792]: I0301 09:32:09.528758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerStarted","Data":"3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874"} Mar 01 09:32:10 crc kubenswrapper[4792]: I0301 09:32:10.539055 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3a408d8-0510-4867-8517-e609d614a5d2" containerID="9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8" exitCode=0 Mar 01 09:32:10 crc kubenswrapper[4792]: I0301 09:32:10.539145 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8"} Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.548088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerStarted","Data":"8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0"} Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.548749 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:11 crc kubenswrapper[4792]: I0301 09:32:11.599112 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb7494899-9x44w" podStartSLOduration=4.59909393 podStartE2EDuration="4.59909393s" podCreationTimestamp="2026-03-01 09:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:11.571484795 +0000 UTC m=+1460.813364042" watchObservedRunningTime="2026-03-01 09:32:11.59909393 +0000 UTC m=+1460.840973127" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.197076 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.289764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.290102 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" containerID="cri-o://6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" gracePeriod=10 Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.622304 4792 generic.go:334] "Generic (PLEG): container finished" podID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerID="6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" exitCode=0 Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.622360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781"} Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.777134 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931253 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.931431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9kw4\" 
(UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") pod \"241ce805-8049-491d-bdf9-eeadf9ea4080\" (UID: \"241ce805-8049-491d-bdf9-eeadf9ea4080\") " Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.954304 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4" (OuterVolumeSpecName: "kube-api-access-p9kw4") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "kube-api-access-p9kw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.990298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:18 crc kubenswrapper[4792]: I0301 09:32:18.990631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.011405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.011412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.017415 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config" (OuterVolumeSpecName: "config") pod "241ce805-8049-491d-bdf9-eeadf9ea4080" (UID: "241ce805-8049-491d-bdf9-eeadf9ea4080"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033954 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033985 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.033995 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9kw4\" (UniqueName: \"kubernetes.io/projected/241ce805-8049-491d-bdf9-eeadf9ea4080-kube-api-access-p9kw4\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034012 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-config\") on node \"crc\" 
DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034024 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.034035 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241ce805-8049-491d-bdf9-eeadf9ea4080-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.631643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" event={"ID":"241ce805-8049-491d-bdf9-eeadf9ea4080","Type":"ContainerDied","Data":"31553e15cd3569da29e5d2b0a1053e7e96e46dcf7276c14267ec5547dc01b54a"} Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.631694 4792 scope.go:117] "RemoveContainer" containerID="6d67b5704dbd88e3a06443bdd2c44be45361687723ea422a2d3f017f9864c781" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.632674 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-lcb28" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.657095 4792 scope.go:117] "RemoveContainer" containerID="05e67a0819aa5ed5e27ec53a14d250ee562957cc561df53697e3217eab454f4c" Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.660423 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:19 crc kubenswrapper[4792]: I0301 09:32:19.667586 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-lcb28"] Mar 01 09:32:21 crc kubenswrapper[4792]: I0301 09:32:21.418625 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" path="/var/lib/kubelet/pods/241ce805-8049-491d-bdf9-eeadf9ea4080/volumes" Mar 01 09:32:27 crc kubenswrapper[4792]: I0301 09:32:27.534783 4792 scope.go:117] "RemoveContainer" containerID="30d57fe1f686a0e7d648422ad7801f657bc274b2e9502cf906d12a5e85e207f4" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.350877 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352024 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352049 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352085 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352094 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: 
E0301 09:32:28.352111 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352121 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: E0301 09:32:28.352141 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352150 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="init" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352574 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc02dae-4469-4e20-aca1-c85d7e451b7f" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.352610 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="241ce805-8049-491d-bdf9-eeadf9ea4080" containerName="dnsmasq-dns" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.353670 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.358608 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.358811 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.372495 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.372786 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.387615 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503349 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc 
kubenswrapper[4792]: I0301 09:32:28.503397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.503632 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605200 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605261 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605336 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.605379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.611960 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.616484 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.619100 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.619697 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:28 crc kubenswrapper[4792]: I0301 09:32:28.677712 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.180987 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.736629 4792 generic.go:334] "Generic (PLEG): container finished" podID="63658b27-63d9-4a0f-afca-3a3c245b9b9d" containerID="2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764" exitCode=0 Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.736740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerDied","Data":"2ab8068f1ad5a6321a051380bfb498b59d9b17d29b4726fc01b0c882c34ec764"} Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.739631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerStarted","Data":"d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8"} Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.741830 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0e1dd7a-6a53-446d-bf90-5813f7a3fda0" 
containerID="a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede" exitCode=0 Mar 01 09:32:29 crc kubenswrapper[4792]: I0301 09:32:29.742004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerDied","Data":"a31d63c5aa5d050434ce6a4f0e9ec1378f77889b76bca1387d32b037ac0b6ede"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.755749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"63658b27-63d9-4a0f-afca-3a3c245b9b9d","Type":"ContainerStarted","Data":"d696bdd63d2cda6818c556f518a2d022a5026e40dcd06f6b79dce7c77e643e51"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.756606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.760967 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0e1dd7a-6a53-446d-bf90-5813f7a3fda0","Type":"ContainerStarted","Data":"bd13573e587f769a572a7e418917f49e78ffad74eac437c621b579cbfab272e8"} Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.761871 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:32:30 crc kubenswrapper[4792]: I0301 09:32:30.784447 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.784429925 podStartE2EDuration="36.784429925s" podCreationTimestamp="2026-03-01 09:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:30.781504167 +0000 UTC m=+1480.023383364" watchObservedRunningTime="2026-03-01 09:32:30.784429925 +0000 UTC m=+1480.026309122" Mar 01 09:32:31 crc kubenswrapper[4792]: I0301 09:32:31.435223 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.435201019 podStartE2EDuration="37.435201019s" podCreationTimestamp="2026-03-01 09:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 09:32:30.806586862 +0000 UTC m=+1480.048466079" watchObservedRunningTime="2026-03-01 09:32:31.435201019 +0000 UTC m=+1480.677080206" Mar 01 09:32:41 crc kubenswrapper[4792]: I0301 09:32:41.850774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerStarted","Data":"5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe"} Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.872086 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.895132 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 01 09:32:44 crc kubenswrapper[4792]: I0301 09:32:44.905673 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" podStartSLOduration=5.374944135 podStartE2EDuration="16.905652483s" podCreationTimestamp="2026-03-01 09:32:28 +0000 UTC" firstStartedPulling="2026-03-01 09:32:29.182630847 +0000 UTC m=+1478.424510044" lastFinishedPulling="2026-03-01 09:32:40.713339195 +0000 UTC m=+1489.955218392" observedRunningTime="2026-03-01 09:32:41.880041104 +0000 UTC m=+1491.121920371" watchObservedRunningTime="2026-03-01 09:32:44.905652483 +0000 UTC m=+1494.147531700" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.561797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.564168 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.629099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.702930 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.702993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.703042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " 
pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.804933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.805441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.805446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc kubenswrapper[4792]: I0301 09:32:45.824947 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"redhat-operators-8np4j\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:45 crc 
kubenswrapper[4792]: I0301 09:32:45.884838 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.444921 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888500 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" exitCode=0 Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954"} Mar 01 09:32:46 crc kubenswrapper[4792]: I0301 09:32:46.888778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"0191adc5316cf18ee8f837f299248bbe54008a3de338797be3e29fa3dacc63bb"} Mar 01 09:32:47 crc kubenswrapper[4792]: I0301 09:32:47.908021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} Mar 01 09:32:52 crc kubenswrapper[4792]: I0301 09:32:52.955044 4792 generic.go:334] "Generic (PLEG): container finished" podID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerID="5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe" exitCode=0 Mar 01 09:32:52 crc kubenswrapper[4792]: I0301 09:32:52.955109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" 
event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerDied","Data":"5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe"} Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.543381 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.599777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.600135 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.600421 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.601022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") pod \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\" (UID: \"54e68c85-54c7-4855-b4a0-a85d2014c7b7\") " Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.607454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk" (OuterVolumeSpecName: "kube-api-access-p25kk") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "kube-api-access-p25kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.623146 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.632615 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.650817 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory" (OuterVolumeSpecName: "inventory") pod "54e68c85-54c7-4855-b4a0-a85d2014c7b7" (UID: "54e68c85-54c7-4855-b4a0-a85d2014c7b7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704586 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704662 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704689 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p25kk\" (UniqueName: \"kubernetes.io/projected/54e68c85-54c7-4855-b4a0-a85d2014c7b7-kube-api-access-p25kk\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.704709 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e68c85-54c7-4855-b4a0-a85d2014c7b7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" event={"ID":"54e68c85-54c7-4855-b4a0-a85d2014c7b7","Type":"ContainerDied","Data":"d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8"} Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974566 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d567265e3ca5a368f3df31052aed27c28d8ae817e07fd00d4fefda5d841711b8" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.974536 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln" Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.977154 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" exitCode=0 Mar 01 09:32:54 crc kubenswrapper[4792]: I0301 09:32:54.977190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.084825 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:55 crc kubenswrapper[4792]: E0301 09:32:55.085211 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085228 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085389 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.085955 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.088687 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089158 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089451 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.089630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.101332 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109274 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109559 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.109635 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211285 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.211654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.215686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.215870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.216253 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.235285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.407139 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:32:55 crc kubenswrapper[4792]: I0301 09:32:55.988658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerStarted","Data":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.014551 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8np4j" podStartSLOduration=2.533332442 podStartE2EDuration="11.014533512s" podCreationTimestamp="2026-03-01 09:32:45 +0000 UTC" firstStartedPulling="2026-03-01 09:32:46.890679467 +0000 UTC m=+1496.132558664" lastFinishedPulling="2026-03-01 09:32:55.371880537 +0000 UTC m=+1504.613759734" observedRunningTime="2026-03-01 09:32:56.013470907 +0000 UTC m=+1505.255350104" watchObservedRunningTime="2026-03-01 09:32:56.014533512 +0000 UTC m=+1505.256412709" Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.035212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:32:56 crc kubenswrapper[4792]: W0301 09:32:56.041090 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a742181_aebe_42f8_a83e_fee7b480366b.slice/crio-b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890 WatchSource:0}: Error finding container b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890: Status 404 returned error can't find the container with id b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890 Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.996543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerStarted","Data":"1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29"} Mar 01 09:32:56 crc kubenswrapper[4792]: I0301 09:32:56.997067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerStarted","Data":"b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890"} Mar 01 09:32:57 crc kubenswrapper[4792]: I0301 09:32:57.014847 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" podStartSLOduration=1.6561432200000001 podStartE2EDuration="2.014829335s" podCreationTimestamp="2026-03-01 09:32:55 +0000 UTC" firstStartedPulling="2026-03-01 09:32:56.043673522 +0000 UTC m=+1505.285552719" lastFinishedPulling="2026-03-01 09:32:56.402359637 +0000 UTC m=+1505.644238834" observedRunningTime="2026-03-01 09:32:57.012536021 +0000 UTC m=+1506.254415218" watchObservedRunningTime="2026-03-01 09:32:57.014829335 +0000 UTC m=+1506.256708532" Mar 01 09:33:04 crc kubenswrapper[4792]: I0301 09:33:04.943248 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:33:04 crc kubenswrapper[4792]: I0301 09:33:04.943787 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:33:05 crc kubenswrapper[4792]: I0301 09:33:05.885617 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:05 crc kubenswrapper[4792]: I0301 09:33:05.885665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:06 crc kubenswrapper[4792]: I0301 09:33:06.929038 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" probeResult="failure" output=< Mar 01 09:33:06 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:33:06 crc kubenswrapper[4792]: > Mar 01 09:33:16 crc kubenswrapper[4792]: I0301 09:33:16.929073 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" probeResult="failure" output=< Mar 01 09:33:16 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:33:16 crc kubenswrapper[4792]: > Mar 01 09:33:25 crc kubenswrapper[4792]: I0301 09:33:25.931447 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:25 crc kubenswrapper[4792]: I0301 09:33:25.979897 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:26 crc kubenswrapper[4792]: I0301 09:33:26.187238 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.280117 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8np4j" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" containerID="cri-o://14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" gracePeriod=2 Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.637213 4792 scope.go:117] "RemoveContainer" containerID="e872a8250debe35b2b405169cd43bdbc962c34739bd277e35d8038f3fa166251" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.647936 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.658698 4792 scope.go:117] "RemoveContainer" containerID="6dd143b9e9badd592279cca432fe539c49e92a79ca469f608516d1e967d18c73" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.698181 4792 scope.go:117] "RemoveContainer" containerID="68eabc969b4329c81ee454f5c339af1b09a491b6cf0b1ab092fc279d1ef9e440" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.735451 4792 scope.go:117] "RemoveContainer" containerID="81c1c2615bd05b6e2f8a23b6d892f8335b3c7a5c117575ce3ed245f2faa7543f" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791744 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.791988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") pod \"6ade9641-e262-4086-b871-3d010d48a86a\" (UID: \"6ade9641-e262-4086-b871-3d010d48a86a\") " Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.793009 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities" (OuterVolumeSpecName: "utilities") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.796748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s" (OuterVolumeSpecName: "kube-api-access-9469s") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "kube-api-access-9469s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.894657 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9469s\" (UniqueName: \"kubernetes.io/projected/6ade9641-e262-4086-b871-3d010d48a86a-kube-api-access-9469s\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.894701 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.918726 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ade9641-e262-4086-b871-3d010d48a86a" (UID: "6ade9641-e262-4086-b871-3d010d48a86a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:33:27 crc kubenswrapper[4792]: I0301 09:33:27.996985 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ade9641-e262-4086-b871-3d010d48a86a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.291967 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ade9641-e262-4086-b871-3d010d48a86a" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" exitCode=0 Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292029 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8np4j" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292017 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292377 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8np4j" event={"ID":"6ade9641-e262-4086-b871-3d010d48a86a","Type":"ContainerDied","Data":"0191adc5316cf18ee8f837f299248bbe54008a3de338797be3e29fa3dacc63bb"} Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.292408 4792 scope.go:117] "RemoveContainer" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.320134 4792 scope.go:117] "RemoveContainer" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.332309 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.341801 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8np4j"] Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.344227 4792 scope.go:117] "RemoveContainer" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364219 4792 scope.go:117] "RemoveContainer" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.364704 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": container with ID starting with 14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0 not found: ID does not exist" containerID="14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364733 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0"} err="failed to get container status \"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": rpc error: code = NotFound desc = could not find container \"14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0\": container with ID starting with 14aff226ae1ef4a69e0db8d441089bf4ab4d70fd98e04b21c3691c5e2b4594f0 not found: ID does not exist" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.364754 4792 scope.go:117] "RemoveContainer" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.365034 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": container with ID starting with f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e not found: ID does not exist" containerID="f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365060 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e"} err="failed to get container status \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": rpc error: code = NotFound desc = could not find container \"f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e\": container with ID 
starting with f3ae7ab1c5a82ee5a1b41247907dc695e85724c88c6834ad6eff45f31feeda9e not found: ID does not exist" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365195 4792 scope.go:117] "RemoveContainer" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: E0301 09:33:28.365404 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": container with ID starting with 80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954 not found: ID does not exist" containerID="80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954" Mar 01 09:33:28 crc kubenswrapper[4792]: I0301 09:33:28.365422 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954"} err="failed to get container status \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": rpc error: code = NotFound desc = could not find container \"80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954\": container with ID starting with 80ca295b5ffcae69a52408094affab38679155b5f4a8145c7ff37412f7ceb954 not found: ID does not exist" Mar 01 09:33:29 crc kubenswrapper[4792]: I0301 09:33:29.418424 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ade9641-e262-4086-b871-3d010d48a86a" path="/var/lib/kubelet/pods/6ade9641-e262-4086-b871-3d010d48a86a/volumes" Mar 01 09:33:34 crc kubenswrapper[4792]: I0301 09:33:34.943509 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:33:34 crc kubenswrapper[4792]: I0301 
09:33:34.944062 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.146588 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149332 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149460 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149541 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-utilities" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149612 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-utilities" Mar 01 09:34:00 crc kubenswrapper[4792]: E0301 09:34:00.149701 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-content" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.149778 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="extract-content" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.150110 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ade9641-e262-4086-b871-3d010d48a86a" containerName="registry-server" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.150960 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.153984 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.154187 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.154364 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.157508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.275916 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.377731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.408830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"auto-csr-approver-29539294-5jtlh\" (UID: 
\"a6725e35-5100-4360-85ca-00aad33007d4\") " pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.472554 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.922955 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:34:00 crc kubenswrapper[4792]: I0301 09:34:00.924702 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:34:01 crc kubenswrapper[4792]: I0301 09:34:01.561414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerStarted","Data":"63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5"} Mar 01 09:34:02 crc kubenswrapper[4792]: I0301 09:34:02.572814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerStarted","Data":"e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3"} Mar 01 09:34:02 crc kubenswrapper[4792]: I0301 09:34:02.594859 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" podStartSLOduration=1.733798539 podStartE2EDuration="2.594823121s" podCreationTimestamp="2026-03-01 09:34:00 +0000 UTC" firstStartedPulling="2026-03-01 09:34:00.922743563 +0000 UTC m=+1570.164622760" lastFinishedPulling="2026-03-01 09:34:01.783768135 +0000 UTC m=+1571.025647342" observedRunningTime="2026-03-01 09:34:02.584456349 +0000 UTC m=+1571.826335536" watchObservedRunningTime="2026-03-01 09:34:02.594823121 +0000 UTC m=+1571.836702318" Mar 01 09:34:03 crc kubenswrapper[4792]: I0301 09:34:03.583627 4792 
generic.go:334] "Generic (PLEG): container finished" podID="a6725e35-5100-4360-85ca-00aad33007d4" containerID="e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3" exitCode=0 Mar 01 09:34:03 crc kubenswrapper[4792]: I0301 09:34:03.583674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerDied","Data":"e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3"} Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.921586 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943120 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943173 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943210 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943845 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:34:04 crc kubenswrapper[4792]: I0301 09:34:04.943920 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06" gracePeriod=600 Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.069808 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") pod \"a6725e35-5100-4360-85ca-00aad33007d4\" (UID: \"a6725e35-5100-4360-85ca-00aad33007d4\") " Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.075386 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9" (OuterVolumeSpecName: "kube-api-access-qmfl9") pod "a6725e35-5100-4360-85ca-00aad33007d4" (UID: "a6725e35-5100-4360-85ca-00aad33007d4"). InnerVolumeSpecName "kube-api-access-qmfl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.172172 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmfl9\" (UniqueName: \"kubernetes.io/projected/a6725e35-5100-4360-85ca-00aad33007d4-kube-api-access-qmfl9\") on node \"crc\" DevicePath \"\"" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618066 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06" exitCode=0 Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.618983 4792 scope.go:117] "RemoveContainer" containerID="d3c51b46f8c635f0bd922e6a816c6cbadeb855fe42f4d60474cde44514e4e4de" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" event={"ID":"a6725e35-5100-4360-85ca-00aad33007d4","Type":"ContainerDied","Data":"63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5"} Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636593 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63dbef626b111579f2bd4db9775a778114d3a877b3fcc4798e7ba32d50b983f5" Mar 01 
09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.636656 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539294-5jtlh" Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.675928 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:34:05 crc kubenswrapper[4792]: I0301 09:34:05.686562 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539288-9klf4"] Mar 01 09:34:07 crc kubenswrapper[4792]: I0301 09:34:07.419141 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8f417d-a9b7-4969-9e24-785fa8baf9c4" path="/var/lib/kubelet/pods/2e8f417d-a9b7-4969-9e24-785fa8baf9c4/volumes" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.824783 4792 scope.go:117] "RemoveContainer" containerID="ac3d91f96c6efaf7baa089ecdf84d5d3fe923f61545b960c7a1aa1d77e8db2e5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.844689 4792 scope.go:117] "RemoveContainer" containerID="670f35fecbdbd6b75d71f5beac8b8d230ac85378c92ee4b0813d0d49f8a4dde5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.902350 4792 scope.go:117] "RemoveContainer" containerID="0b4398286a53ae92983ef93db19480d6804e4b83a997761fc68f16627e65ecd5" Mar 01 09:34:27 crc kubenswrapper[4792]: I0301 09:34:27.952230 4792 scope.go:117] "RemoveContainer" containerID="4fd86a535781157d736a326c5d3973270ef9e75f90ca5c7a184728477646f601" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.455256 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:34:59 crc kubenswrapper[4792]: E0301 09:34:59.456657 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.456676 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.456898 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6725e35-5100-4360-85ca-00aad33007d4" containerName="oc" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.458483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.466960 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.556981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.557091 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.557147 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659298 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659452 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.659935 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.677575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rc5\" (UniqueName: 
\"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"redhat-marketplace-wl5zx\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:34:59 crc kubenswrapper[4792]: I0301 09:34:59.779534 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:00 crc kubenswrapper[4792]: I0301 09:35:00.239050 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134072 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0" exitCode=0 Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"} Mar 01 09:35:01 crc kubenswrapper[4792]: I0301 09:35:01.134398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"38095d24331ba4477eb556caad35e6df6322eb44f1fa245ce42a42dc8a30b1ca"} Mar 01 09:35:02 crc kubenswrapper[4792]: I0301 09:35:02.144580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.154112 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" 
containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c" exitCode=0 Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.154362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.237412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.239418 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.253660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330316 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.330402 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432223 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432579 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.432630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.452929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"community-operators-fcxll\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:03 crc kubenswrapper[4792]: I0301 09:35:03.556969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.061964 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.163265 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"3a50e5fc5ec68b469488a198251a926259d52265bbc08a456414a2ff134347dd"} Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.165957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerStarted","Data":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"} Mar 01 09:35:04 crc kubenswrapper[4792]: I0301 09:35:04.190300 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wl5zx" podStartSLOduration=2.794768243 podStartE2EDuration="5.190275422s" podCreationTimestamp="2026-03-01 09:34:59 +0000 UTC" firstStartedPulling="2026-03-01 09:35:01.135773445 +0000 
UTC m=+1630.377652652" lastFinishedPulling="2026-03-01 09:35:03.531280634 +0000 UTC m=+1632.773159831" observedRunningTime="2026-03-01 09:35:04.18487096 +0000 UTC m=+1633.426750167" watchObservedRunningTime="2026-03-01 09:35:04.190275422 +0000 UTC m=+1633.432154619" Mar 01 09:35:05 crc kubenswrapper[4792]: I0301 09:35:05.175185 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926" exitCode=0 Mar 01 09:35:05 crc kubenswrapper[4792]: I0301 09:35:05.175407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"} Mar 01 09:35:06 crc kubenswrapper[4792]: I0301 09:35:06.184414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"} Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.208370 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2" exitCode=0 Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.208446 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"} Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.780396 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.780733 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:09 crc kubenswrapper[4792]: I0301 09:35:09.827728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.220261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerStarted","Data":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"} Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.253345 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcxll" podStartSLOduration=2.840584367 podStartE2EDuration="7.253324624s" podCreationTimestamp="2026-03-01 09:35:03 +0000 UTC" firstStartedPulling="2026-03-01 09:35:05.177747295 +0000 UTC m=+1634.419626492" lastFinishedPulling="2026-03-01 09:35:09.590487552 +0000 UTC m=+1638.832366749" observedRunningTime="2026-03-01 09:35:10.245309408 +0000 UTC m=+1639.487188605" watchObservedRunningTime="2026-03-01 09:35:10.253324624 +0000 UTC m=+1639.495203821" Mar 01 09:35:10 crc kubenswrapper[4792]: I0301 09:35:10.273670 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.029621 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.235666 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wl5zx" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server" containerID="cri-o://ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" 
gracePeriod=2 Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.721250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.808341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") pod \"a89252a8-b40d-4834-b779-de581f79f189\" (UID: \"a89252a8-b40d-4834-b779-de581f79f189\") " Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.809545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities" (OuterVolumeSpecName: "utilities") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.810160 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.814259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5" (OuterVolumeSpecName: "kube-api-access-29rc5") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "kube-api-access-29rc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.836093 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a89252a8-b40d-4834-b779-de581f79f189" (UID: "a89252a8-b40d-4834-b779-de581f79f189"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.912332 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a89252a8-b40d-4834-b779-de581f79f189-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:12 crc kubenswrapper[4792]: I0301 09:35:12.912759 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rc5\" (UniqueName: \"kubernetes.io/projected/a89252a8-b40d-4834-b779-de581f79f189-kube-api-access-29rc5\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245200 4792 generic.go:334] "Generic (PLEG): container finished" podID="a89252a8-b40d-4834-b779-de581f79f189" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" exitCode=0 Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"} Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245295 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl5zx" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245312 4792 scope.go:117] "RemoveContainer" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.245298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl5zx" event={"ID":"a89252a8-b40d-4834-b779-de581f79f189","Type":"ContainerDied","Data":"38095d24331ba4477eb556caad35e6df6322eb44f1fa245ce42a42dc8a30b1ca"} Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.273670 4792 scope.go:117] "RemoveContainer" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.279987 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.297030 4792 scope.go:117] "RemoveContainer" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.308775 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl5zx"] Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.337105 4792 scope.go:117] "RemoveContainer" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 09:35:13.337983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": container with ID starting with ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71 not found: ID does not exist" containerID="ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338028 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71"} err="failed to get container status \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": rpc error: code = NotFound desc = could not find container \"ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71\": container with ID starting with ba9b476cb3c0778e21f4a93c3bb2db37852633c19ddbf425c623316bd8484f71 not found: ID does not exist" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338051 4792 scope.go:117] "RemoveContainer" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c" Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 09:35:13.338362 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": container with ID starting with f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c not found: ID does not exist" containerID="f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338391 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c"} err="failed to get container status \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": rpc error: code = NotFound desc = could not find container \"f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c\": container with ID starting with f5497cbcc8961e27f62acf3129e4a7486e0def49b4ca62f36cbf0ef4873f6c5c not found: ID does not exist" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338407 4792 scope.go:117] "RemoveContainer" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0" Mar 01 09:35:13 crc kubenswrapper[4792]: E0301 
09:35:13.338729 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": container with ID starting with f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0 not found: ID does not exist" containerID="f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.338784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0"} err="failed to get container status \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": rpc error: code = NotFound desc = could not find container \"f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0\": container with ID starting with f225bc33f46e6dda33bce0741685921fe0d892a9accebd1197ca53fc9c52e0b0 not found: ID does not exist" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.420857 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89252a8-b40d-4834-b779-de581f79f189" path="/var/lib/kubelet/pods/a89252a8-b40d-4834-b779-de581f79f189/volumes" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.557421 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:13 crc kubenswrapper[4792]: I0301 09:35:13.557504 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:14 crc kubenswrapper[4792]: I0301 09:35:14.603038 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fcxll" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" probeResult="failure" output=< Mar 01 09:35:14 crc kubenswrapper[4792]: timeout: failed to connect service 
":50051" within 1s Mar 01 09:35:14 crc kubenswrapper[4792]: > Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.609311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.665022 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:23 crc kubenswrapper[4792]: I0301 09:35:23.851509 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.340167 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcxll" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" containerID="cri-o://5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" gracePeriod=2 Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.810687 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.951785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.951838 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.952122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") pod \"1afe4776-6480-4f35-afcc-a281193262c9\" (UID: \"1afe4776-6480-4f35-afcc-a281193262c9\") " Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.953898 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities" (OuterVolumeSpecName: "utilities") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:35:25 crc kubenswrapper[4792]: I0301 09:35:25.960793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm" (OuterVolumeSpecName: "kube-api-access-6djqm") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "kube-api-access-6djqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.017649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1afe4776-6480-4f35-afcc-a281193262c9" (UID: "1afe4776-6480-4f35-afcc-a281193262c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054775 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054811 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afe4776-6480-4f35-afcc-a281193262c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.054827 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6djqm\" (UniqueName: \"kubernetes.io/projected/1afe4776-6480-4f35-afcc-a281193262c9-kube-api-access-6djqm\") on node \"crc\" DevicePath \"\"" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.353847 4792 generic.go:334] "Generic (PLEG): container finished" podID="1afe4776-6480-4f35-afcc-a281193262c9" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" exitCode=0 Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.353981 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"} Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354021 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fcxll" event={"ID":"1afe4776-6480-4f35-afcc-a281193262c9","Type":"ContainerDied","Data":"3a50e5fc5ec68b469488a198251a926259d52265bbc08a456414a2ff134347dd"} Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354050 4792 scope.go:117] "RemoveContainer" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.354204 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcxll" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.376023 4792 scope.go:117] "RemoveContainer" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.404570 4792 scope.go:117] "RemoveContainer" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.421818 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.431414 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcxll"] Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.458289 4792 scope.go:117] "RemoveContainer" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" Mar 01 09:35:26 crc kubenswrapper[4792]: E0301 09:35:26.460403 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": container with ID starting with 5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666 not found: ID does not exist" containerID="5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 
09:35:26.460455 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666"} err="failed to get container status \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": rpc error: code = NotFound desc = could not find container \"5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666\": container with ID starting with 5ae47c4e6417d04fae7af9210d91c331587934aecae33a0c298b593503918666 not found: ID does not exist" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.460487 4792 scope.go:117] "RemoveContainer" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2" Mar 01 09:35:26 crc kubenswrapper[4792]: E0301 09:35:26.460829 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": container with ID starting with 28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2 not found: ID does not exist" containerID="28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.460929 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2"} err="failed to get container status \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": rpc error: code = NotFound desc = could not find container \"28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2\": container with ID starting with 28a4a862fd946bea0b146fb4f74f4d0dd37e8c6bdea644885c268ed6f637d7f2 not found: ID does not exist" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.461042 4792 scope.go:117] "RemoveContainer" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926" Mar 01 09:35:26 crc 
kubenswrapper[4792]: E0301 09:35:26.461511 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": container with ID starting with 388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926 not found: ID does not exist" containerID="388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926" Mar 01 09:35:26 crc kubenswrapper[4792]: I0301 09:35:26.461544 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926"} err="failed to get container status \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": rpc error: code = NotFound desc = could not find container \"388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926\": container with ID starting with 388a7a778d56b9d677489e2790140ca00303b0b34ce53763de5976889826c926 not found: ID does not exist" Mar 01 09:35:27 crc kubenswrapper[4792]: I0301 09:35:27.418470 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1afe4776-6480-4f35-afcc-a281193262c9" path="/var/lib/kubelet/pods/1afe4776-6480-4f35-afcc-a281193262c9/volumes" Mar 01 09:35:28 crc kubenswrapper[4792]: I0301 09:35:28.049995 4792 scope.go:117] "RemoveContainer" containerID="10b8b301db81f96185e1ca933d6d257371da7cfb2a533a1434c80a6ff2a5895f" Mar 01 09:35:28 crc kubenswrapper[4792]: I0301 09:35:28.073955 4792 scope.go:117] "RemoveContainer" containerID="131d1c48281fda07ee509861ecd19ed50a8dc2c67c40d98a6892403dc5e2415a" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.143930 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144753 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" 
containerName="extract-utilities" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144766 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-utilities" Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144796 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144802 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-utilities" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144816 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-utilities" Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144826 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144831 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144845 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-content" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144850 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="extract-content" Mar 01 09:36:00 crc kubenswrapper[4792]: E0301 09:36:00.144864 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afe4776-6480-4f35-afcc-a281193262c9" 
containerName="extract-content" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.144869 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="extract-content" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145118 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1afe4776-6480-4f35-afcc-a281193262c9" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145132 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89252a8-b40d-4834-b779-de581f79f189" containerName="registry-server" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.145765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.149350 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.151142 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.152467 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.160560 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.265444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:00 crc 
kubenswrapper[4792]: I0301 09:36:00.370892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.392674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"auto-csr-approver-29539296-87tnt\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.464239 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:00 crc kubenswrapper[4792]: I0301 09:36:00.926684 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:36:01 crc kubenswrapper[4792]: I0301 09:36:01.136386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerStarted","Data":"0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915"} Mar 01 09:36:02 crc kubenswrapper[4792]: I0301 09:36:02.145543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerStarted","Data":"8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56"} Mar 01 09:36:03 crc kubenswrapper[4792]: I0301 09:36:03.156155 4792 generic.go:334] "Generic (PLEG): container finished" podID="f0029741-30a3-4fc2-b71d-c77dbd652c35" 
containerID="8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56" exitCode=0 Mar 01 09:36:03 crc kubenswrapper[4792]: I0301 09:36:03.156217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerDied","Data":"8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56"} Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.095103 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.101541 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.107837 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236493 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.236882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339114 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339664 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.339896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.362777 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"certified-operators-cbpls\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.424299 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.552306 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.644467 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") pod \"f0029741-30a3-4fc2-b71d-c77dbd652c35\" (UID: \"f0029741-30a3-4fc2-b71d-c77dbd652c35\") " Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.661428 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9" (OuterVolumeSpecName: "kube-api-access-wv9f9") pod "f0029741-30a3-4fc2-b71d-c77dbd652c35" (UID: "f0029741-30a3-4fc2-b71d-c77dbd652c35"). InnerVolumeSpecName "kube-api-access-wv9f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.746492 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv9f9\" (UniqueName: \"kubernetes.io/projected/f0029741-30a3-4fc2-b71d-c77dbd652c35-kube-api-access-wv9f9\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:04 crc kubenswrapper[4792]: I0301 09:36:04.895390 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:04 crc kubenswrapper[4792]: W0301 09:36:04.904130 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3dfbd4_8d64_4be6_b93d_a3e300c4ed6c.slice/crio-5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86 WatchSource:0}: Error finding container 5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86: Status 404 returned error can't find the container with id 5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86 Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539296-87tnt" event={"ID":"f0029741-30a3-4fc2-b71d-c77dbd652c35","Type":"ContainerDied","Data":"0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915"} Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173955 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d3079eb6446bd098d6ec2c9ffe0811e97aa3c5bc35e8c3c9bc098850cbba915" Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.173745 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539296-87tnt" Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.176842 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a742181-aebe-42f8-a83e-fee7b480366b" containerID="1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29" exitCode=0 Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.176939 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerDied","Data":"1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29"} Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181592 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" exitCode=0 Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e"} Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.181661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86"} Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.613254 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"] Mar 01 09:36:05 crc kubenswrapper[4792]: I0301 09:36:05.622192 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539290-npvbb"] Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.197320 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.683504 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780933 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.780971 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") pod \"4a742181-aebe-42f8-a83e-fee7b480366b\" (UID: \"4a742181-aebe-42f8-a83e-fee7b480366b\") " Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.787306 
4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb" (OuterVolumeSpecName: "kube-api-access-xlgpb") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "kube-api-access-xlgpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.790088 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.808256 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.812197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory" (OuterVolumeSpecName: "inventory") pod "4a742181-aebe-42f8-a83e-fee7b480366b" (UID: "4a742181-aebe-42f8-a83e-fee7b480366b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884087 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884261 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884316 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a742181-aebe-42f8-a83e-fee7b480366b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:06 crc kubenswrapper[4792]: I0301 09:36:06.884392 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgpb\" (UniqueName: \"kubernetes.io/projected/4a742181-aebe-42f8-a83e-fee7b480366b-kube-api-access-xlgpb\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.227992 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.229584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs" event={"ID":"4a742181-aebe-42f8-a83e-fee7b480366b","Type":"ContainerDied","Data":"b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890"} Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.229650 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08f4fb0c894f674888aac3a423e143dc1d90f11637cf92b6939cf3c1832d890" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300401 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:07 crc kubenswrapper[4792]: E0301 09:36:07.300877 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300898 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: E0301 09:36:07.300950 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.300959 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.301159 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" containerName="oc" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.301199 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4a742181-aebe-42f8-a83e-fee7b480366b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.305198 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.308558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309100 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309222 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.309550 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.311299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.392733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.393112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.393202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.417644 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3644e57-7093-4402-a6f2-48ed10ac14fa" path="/var/lib/kubelet/pods/d3644e57-7093-4402-a6f2-48ed10ac14fa/volumes" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.494816 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.498105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.498611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.511426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:07 crc kubenswrapper[4792]: I0301 09:36:07.619159 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:36:08 crc kubenswrapper[4792]: W0301 09:36:08.128305 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f054d9d_4fbb_4909_826c_e6037c4716bd.slice/crio-3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47 WatchSource:0}: Error finding container 3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47: Status 404 returned error can't find the container with id 3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47 Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.129009 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.241132 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" exitCode=0 Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.241208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} Mar 01 09:36:08 crc kubenswrapper[4792]: I0301 09:36:08.244342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerStarted","Data":"3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.263628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" 
event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerStarted","Data":"2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.271078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerStarted","Data":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.282595 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" podStartSLOduration=1.6839720759999999 podStartE2EDuration="2.28257585s" podCreationTimestamp="2026-03-01 09:36:07 +0000 UTC" firstStartedPulling="2026-03-01 09:36:08.130644381 +0000 UTC m=+1697.372523568" lastFinishedPulling="2026-03-01 09:36:08.729248145 +0000 UTC m=+1697.971127342" observedRunningTime="2026-03-01 09:36:09.277928242 +0000 UTC m=+1698.519807449" watchObservedRunningTime="2026-03-01 09:36:09.28257585 +0000 UTC m=+1698.524455047" Mar 01 09:36:09 crc kubenswrapper[4792]: I0301 09:36:09.307062 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cbpls" podStartSLOduration=1.878958288 podStartE2EDuration="5.307031621s" podCreationTimestamp="2026-03-01 09:36:04 +0000 UTC" firstStartedPulling="2026-03-01 09:36:05.183125536 +0000 UTC m=+1694.425004723" lastFinishedPulling="2026-03-01 09:36:08.611198859 +0000 UTC m=+1697.853078056" observedRunningTime="2026-03-01 09:36:09.302387813 +0000 UTC m=+1698.544267010" watchObservedRunningTime="2026-03-01 09:36:09.307031621 +0000 UTC m=+1698.548910818" Mar 01 09:36:14 crc kubenswrapper[4792]: I0301 09:36:14.425503 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:14 crc 
kubenswrapper[4792]: I0301 09:36:14.426111 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:14 crc kubenswrapper[4792]: I0301 09:36:14.485485 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:15 crc kubenswrapper[4792]: I0301 09:36:15.374397 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:15 crc kubenswrapper[4792]: I0301 09:36:15.430958 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.337060 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cbpls" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server" containerID="cri-o://b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" gracePeriod=2 Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.792280 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.908943 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.909139 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.909196 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") pod \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\" (UID: \"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c\") " Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.910141 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities" (OuterVolumeSpecName: "utilities") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.915412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4" (OuterVolumeSpecName: "kube-api-access-t5vt4") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "kube-api-access-t5vt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:36:17 crc kubenswrapper[4792]: I0301 09:36:17.963098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" (UID: "ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011289 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011503 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.011603 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5vt4\" (UniqueName: \"kubernetes.io/projected/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c-kube-api-access-t5vt4\") on node \"crc\" DevicePath \"\"" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348120 4792 generic.go:334] "Generic (PLEG): container finished" podID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" exitCode=0 Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348195 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cbpls" event={"ID":"ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c","Type":"ContainerDied","Data":"5d7b7f5ad9ee24fe834e1813c98805aeb552492372f5eec1abf81f2121a73b86"} Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348211 4792 scope.go:117] "RemoveContainer" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.348358 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbpls" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.365418 4792 scope.go:117] "RemoveContainer" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.386635 4792 scope.go:117] "RemoveContainer" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.395373 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.411795 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cbpls"] Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442017 4792 scope.go:117] "RemoveContainer" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: E0301 09:36:18.442450 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": container with ID starting with b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041 not found: ID does not exist" containerID="b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 
09:36:18.442482 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041"} err="failed to get container status \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": rpc error: code = NotFound desc = could not find container \"b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041\": container with ID starting with b0264f4f8b8a32eaeb72b679e197ca563d73bc43b3c102185f04660359d72041 not found: ID does not exist" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442508 4792 scope.go:117] "RemoveContainer" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: E0301 09:36:18.442952 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": container with ID starting with 5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83 not found: ID does not exist" containerID="5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442978 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83"} err="failed to get container status \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": rpc error: code = NotFound desc = could not find container \"5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83\": container with ID starting with 5e7498bd3071920c8c9e21cfd9c176f5c0382b50521ec3e8a9865375f48f1d83 not found: ID does not exist" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.442995 4792 scope.go:117] "RemoveContainer" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" Mar 01 09:36:18 crc 
kubenswrapper[4792]: E0301 09:36:18.443283 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": container with ID starting with 9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e not found: ID does not exist" containerID="9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e" Mar 01 09:36:18 crc kubenswrapper[4792]: I0301 09:36:18.443301 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e"} err="failed to get container status \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": rpc error: code = NotFound desc = could not find container \"9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e\": container with ID starting with 9c45ce853bc6b1ad8201ac31229a189b5789c0d6a2beb07af331ffabce6af29e not found: ID does not exist" Mar 01 09:36:19 crc kubenswrapper[4792]: I0301 09:36:19.417726 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" path="/var/lib/kubelet/pods/ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c/volumes" Mar 01 09:36:28 crc kubenswrapper[4792]: I0301 09:36:28.177183 4792 scope.go:117] "RemoveContainer" containerID="7926ce126d7f3dd092ea29933967e6329a351e44fde88116cf9663b118841513" Mar 01 09:36:34 crc kubenswrapper[4792]: I0301 09:36:34.944535 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:36:34 crc kubenswrapper[4792]: I0301 09:36:34.945260 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:37:04 crc kubenswrapper[4792]: I0301 09:37:04.942837 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:37:04 crc kubenswrapper[4792]: I0301 09:37:04.943504 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.057408 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.067275 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dlv4c"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.078527 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.089753 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.097600 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-11c1-account-create-update-8h9xf"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.107619 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-dlv4c"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.117793 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2c3a-account-create-update-vnvfb"] Mar 01 09:37:10 crc kubenswrapper[4792]: I0301 09:37:10.128303 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f95nh"] Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.026073 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.033854 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.052537 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8zsss"] Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.061458 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d8c5-account-create-update-z4zgs"] Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.418617 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127158ae-b49c-42bd-932d-af85eafce8c0" path="/var/lib/kubelet/pods/127158ae-b49c-42bd-932d-af85eafce8c0/volumes" Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.419206 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5" path="/var/lib/kubelet/pods/192b539c-c4b9-4c4e-93e3-23b6dc0d7ec5/volumes" Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.419700 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272107df-b15b-4c97-b9b0-e865f9a391da" path="/var/lib/kubelet/pods/272107df-b15b-4c97-b9b0-e865f9a391da/volumes" Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.420204 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d8b4e1-c1b5-468c-b319-84985c525d6a" 
path="/var/lib/kubelet/pods/46d8b4e1-c1b5-468c-b319-84985c525d6a/volumes" Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.421227 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e6fcdd-44b2-4c03-9cf6-a772bd0c3779" path="/var/lib/kubelet/pods/58e6fcdd-44b2-4c03-9cf6-a772bd0c3779/volumes" Mar 01 09:37:11 crc kubenswrapper[4792]: I0301 09:37:11.421726 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869a99e5-f399-4938-ba59-bbe20e23385b" path="/var/lib/kubelet/pods/869a99e5-f399-4938-ba59-bbe20e23385b/volumes" Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.034002 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.040962 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2dgrc"] Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.824615 4792 generic.go:334] "Generic (PLEG): container finished" podID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerID="2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8" exitCode=0 Mar 01 09:37:24 crc kubenswrapper[4792]: I0301 09:37:24.824666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerDied","Data":"2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8"} Mar 01 09:37:25 crc kubenswrapper[4792]: I0301 09:37:25.423257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e80b4e-b68a-48a2-b0fe-e5cf19e00669" path="/var/lib/kubelet/pods/e9e80b4e-b68a-48a2-b0fe-e5cf19e00669/volumes" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.222351 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293093 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.293286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") pod \"1f054d9d-4fbb-4909-826c-e6037c4716bd\" (UID: \"1f054d9d-4fbb-4909-826c-e6037c4716bd\") " Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.304114 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9" (OuterVolumeSpecName: "kube-api-access-8grr9") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). InnerVolumeSpecName "kube-api-access-8grr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.325597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory" (OuterVolumeSpecName: "inventory") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.326132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f054d9d-4fbb-4909-826c-e6037c4716bd" (UID: "1f054d9d-4fbb-4909-826c-e6037c4716bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.395451 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8grr9\" (UniqueName: \"kubernetes.io/projected/1f054d9d-4fbb-4909-826c-e6037c4716bd-kube-api-access-8grr9\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.395768 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.396094 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f054d9d-4fbb-4909-826c-e6037c4716bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.843566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" event={"ID":"1f054d9d-4fbb-4909-826c-e6037c4716bd","Type":"ContainerDied","Data":"3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47"} Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.843795 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5568558fc77cac38af3f9d8b8a28b534315185559e9662b5fc399941b99b47" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 
09:37:26.843627 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.932900 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"] Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933276 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933293 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server" Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-utilities" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-utilities" Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933324 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-content" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933330 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="extract-content" Mar 01 09:37:26 crc kubenswrapper[4792]: E0301 09:37:26.933343 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933350 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3dfbd4-8d64-4be6-b93d-a3e300c4ed6c" containerName="registry-server" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.933531 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.934292 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936526 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936530 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.936573 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.938960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:37:26 crc kubenswrapper[4792]: I0301 09:37:26.965269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"] Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004668 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: 
\"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.004980 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" 
Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.106424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.111344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.116375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.131478 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.258955 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.836345 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"] Mar 01 09:37:27 crc kubenswrapper[4792]: I0301 09:37:27.853744 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerStarted","Data":"1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6"} Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.260528 4792 scope.go:117] "RemoveContainer" containerID="98147a64ef321c5a2be94da9872cf3444a2d6ee5365cb23fe0e8d40238b6ab98" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.311080 4792 scope.go:117] "RemoveContainer" containerID="5c3a4231cfc20731f9ac2774fb470c532f5db1e9d44253c60e2e47577fa458dc" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.388397 4792 scope.go:117] "RemoveContainer" containerID="b99cdab13c59b3d72ed63dfb54dc704e52617818eb25d09b6ad0f435b22c114f" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.461093 4792 scope.go:117] "RemoveContainer" containerID="85f8d1f8a57c04591a9aaccb1305c025dd215cbe1527e2573cb115d042731951" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.506706 4792 scope.go:117] "RemoveContainer" containerID="3b9a5bf9216213ab73f7db6aa95b33bd1c546b1770c33a00558994664a8fc4ce" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.537587 4792 scope.go:117] "RemoveContainer" containerID="79a53da4edce2f856b264b84a40ae3b0fe791d8730afd70b1bb1b19a59aff3f9" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.565053 4792 scope.go:117] "RemoveContainer" containerID="0efefdd3dac5f3f586a3c6d6e7f2ba1305e9a8e8544b4e285a4ab7c3e12e8018" Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.864960 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerStarted","Data":"263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae"} Mar 01 09:37:28 crc kubenswrapper[4792]: I0301 09:37:28.888718 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" podStartSLOduration=2.4177515339999998 podStartE2EDuration="2.888702418s" podCreationTimestamp="2026-03-01 09:37:26 +0000 UTC" firstStartedPulling="2026-03-01 09:37:27.840045161 +0000 UTC m=+1777.081924358" lastFinishedPulling="2026-03-01 09:37:28.310996025 +0000 UTC m=+1777.552875242" observedRunningTime="2026-03-01 09:37:28.882212963 +0000 UTC m=+1778.124092200" watchObservedRunningTime="2026-03-01 09:37:28.888702418 +0000 UTC m=+1778.130581615" Mar 01 09:37:33 crc kubenswrapper[4792]: I0301 09:37:33.918567 4792 generic.go:334] "Generic (PLEG): container finished" podID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerID="263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae" exitCode=0 Mar 01 09:37:33 crc kubenswrapper[4792]: I0301 09:37:33.918650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerDied","Data":"263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae"} Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943074 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943126 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.943170 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.944067 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:37:34 crc kubenswrapper[4792]: I0301 09:37:34.944126 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" gracePeriod=600 Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.058167 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.066486 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ks68h"] Mar 01 09:37:35 crc kubenswrapper[4792]: E0301 09:37:35.076470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.384797 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.462032 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89" path="/var/lib/kubelet/pods/72ae7bbf-4c3f-4182-9b2d-f7645b5a1c89/volumes" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.470553 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") pod \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\" (UID: \"9af8a1fb-52d8-4b08-be39-ad106833ba1c\") " Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.476222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq" (OuterVolumeSpecName: "kube-api-access-h9gdq") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: 
"9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "kube-api-access-h9gdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.496433 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory" (OuterVolumeSpecName: "inventory") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: "9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.506307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9af8a1fb-52d8-4b08-be39-ad106833ba1c" (UID: "9af8a1fb-52d8-4b08-be39-ad106833ba1c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574119 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9gdq\" (UniqueName: \"kubernetes.io/projected/9af8a1fb-52d8-4b08-be39-ad106833ba1c-kube-api-access-h9gdq\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574150 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.574193 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9af8a1fb-52d8-4b08-be39-ad106833ba1c-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939050 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" exitCode=0 Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e"} Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939221 4792 scope.go:117] "RemoveContainer" containerID="6c6c336556c3895a23d652801049ab8fd2cc3ff89812dc0c31bb6831441e0e06" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.939952 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:37:35 crc kubenswrapper[4792]: E0301 09:37:35.940348 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.940941 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.940942 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw" event={"ID":"9af8a1fb-52d8-4b08-be39-ad106833ba1c","Type":"ContainerDied","Data":"1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6"} Mar 01 09:37:35 crc kubenswrapper[4792]: I0301 09:37:35.941013 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fed7b9f25154948ea1a944d03547c3dec990587c0db6a199eae71c8d98766c6" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.061797 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"] Mar 01 09:37:36 crc kubenswrapper[4792]: E0301 09:37:36.062508 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.062525 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.062754 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.063410 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.068447 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.068954 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.069394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.072255 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.079447 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"] Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.084565 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.084681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 
09:37:36.084710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.187496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.187717 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.189277 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.196591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.196633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.203881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kbjsl\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.391966 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.932148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"] Mar 01 09:37:36 crc kubenswrapper[4792]: I0301 09:37:36.953174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerStarted","Data":"26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6"} Mar 01 09:37:37 crc kubenswrapper[4792]: I0301 09:37:37.964224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerStarted","Data":"40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71"} Mar 01 09:37:37 crc kubenswrapper[4792]: I0301 09:37:37.984365 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" podStartSLOduration=1.590300186 podStartE2EDuration="1.984344808s" podCreationTimestamp="2026-03-01 09:37:36 +0000 UTC" firstStartedPulling="2026-03-01 09:37:36.927628606 +0000 UTC m=+1786.169507803" lastFinishedPulling="2026-03-01 09:37:37.321673228 +0000 UTC m=+1786.563552425" observedRunningTime="2026-03-01 09:37:37.976486288 +0000 UTC m=+1787.218365505" watchObservedRunningTime="2026-03-01 09:37:37.984344808 +0000 UTC m=+1787.226224005" Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.028017 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.034752 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-t8rmw"] Mar 01 09:37:49 crc kubenswrapper[4792]: I0301 09:37:49.419298 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b42afb-2954-442e-bc91-4c8275a4d2fd" path="/var/lib/kubelet/pods/f0b42afb-2954-442e-bc91-4c8275a4d2fd/volumes" Mar 01 09:37:51 crc kubenswrapper[4792]: I0301 09:37:51.413870 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:37:51 crc kubenswrapper[4792]: E0301 09:37:51.414352 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.029235 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.036837 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.044549 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.055329 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zxh6d"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.062774 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.072206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.080150 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-kv8gv"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.087131 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-71d5-account-create-update-mjs9k"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.093553 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8714-account-create-update-wzssg"] Mar 01 09:37:52 crc kubenswrapper[4792]: I0301 09:37:52.099783 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e676-account-create-update-5ntgh"] Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.419587 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b17f7c-595d-4b78-9076-037fb2998f60" path="/var/lib/kubelet/pods/46b17f7c-595d-4b78-9076-037fb2998f60/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.420562 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b715bb3f-b181-4614-85c5-9155286ce80c" path="/var/lib/kubelet/pods/b715bb3f-b181-4614-85c5-9155286ce80c/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.421102 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd689802-7b27-463e-a155-ed837e8594e6" path="/var/lib/kubelet/pods/bd689802-7b27-463e-a155-ed837e8594e6/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.421620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabb3d2e-57fa-4ad3-9f3b-b85e0b670650" path="/var/lib/kubelet/pods/dabb3d2e-57fa-4ad3-9f3b-b85e0b670650/volumes" Mar 01 09:37:53 crc kubenswrapper[4792]: I0301 09:37:53.422571 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc2406b-db33-4a33-86f1-dd69b0f537a1" path="/var/lib/kubelet/pods/efc2406b-db33-4a33-86f1-dd69b0f537a1/volumes" Mar 01 09:37:57 crc kubenswrapper[4792]: I0301 09:37:57.036600 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:37:57 crc 
kubenswrapper[4792]: I0301 09:37:57.045270 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jsld8"] Mar 01 09:37:57 crc kubenswrapper[4792]: I0301 09:37:57.421463 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465282ce-1312-4cb6-ae89-de6ada48a901" path="/var/lib/kubelet/pods/465282ce-1312-4cb6-ae89-de6ada48a901/volumes" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.129500 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.130786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.132792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.132989 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.134181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.146665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.258676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.361406 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.383893 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"auto-csr-approver-29539298-ckkqh\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.453112 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:00 crc kubenswrapper[4792]: I0301 09:38:00.896382 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:38:01 crc kubenswrapper[4792]: I0301 09:38:01.157565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerStarted","Data":"c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b"} Mar 01 09:38:02 crc kubenswrapper[4792]: I0301 09:38:02.174547 4792 generic.go:334] "Generic (PLEG): container finished" podID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerID="fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f" exitCode=0 Mar 01 09:38:02 crc kubenswrapper[4792]: I0301 09:38:02.174651 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" 
event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerDied","Data":"fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f"} Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.488639 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.619811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") pod \"7e9f36fa-467b-4b49-9d69-b465a22837e5\" (UID: \"7e9f36fa-467b-4b49-9d69-b465a22837e5\") " Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.631082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb" (OuterVolumeSpecName: "kube-api-access-jgrfb") pod "7e9f36fa-467b-4b49-9d69-b465a22837e5" (UID: "7e9f36fa-467b-4b49-9d69-b465a22837e5"). InnerVolumeSpecName "kube-api-access-jgrfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:03 crc kubenswrapper[4792]: I0301 09:38:03.721828 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgrfb\" (UniqueName: \"kubernetes.io/projected/7e9f36fa-467b-4b49-9d69-b465a22837e5-kube-api-access-jgrfb\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" event={"ID":"7e9f36fa-467b-4b49-9d69-b465a22837e5","Type":"ContainerDied","Data":"c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b"} Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192333 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c746804eed8758bb95b1fa6973b5ac0ac15ae1549125e1b4807551d5b907ac5b" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.192344 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539298-ckkqh" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.409342 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:04 crc kubenswrapper[4792]: E0301 09:38:04.409581 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.548523 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:38:04 crc kubenswrapper[4792]: I0301 09:38:04.556423 4792 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539292-vszff"] Mar 01 09:38:05 crc kubenswrapper[4792]: I0301 09:38:05.417273 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc" path="/var/lib/kubelet/pods/13d3d8c8-6b26-4e5c-94f9-6fe6ae0519fc/volumes" Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.280007 4792 generic.go:334] "Generic (PLEG): container finished" podID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerID="40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71" exitCode=0 Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.280208 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerDied","Data":"40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71"} Mar 01 09:38:15 crc kubenswrapper[4792]: I0301 09:38:15.413171 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:15 crc kubenswrapper[4792]: E0301 09:38:15.413404 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.760095 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800255 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800336 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.800419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") pod \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\" (UID: \"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8\") " Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.805711 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642" (OuterVolumeSpecName: "kube-api-access-f9642") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). InnerVolumeSpecName "kube-api-access-f9642". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.827142 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory" (OuterVolumeSpecName: "inventory") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.829522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" (UID: "31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902027 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902056 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9642\" (UniqueName: \"kubernetes.io/projected/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-kube-api-access-f9642\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:16 crc kubenswrapper[4792]: I0301 09:38:16.902068 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.297661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" event={"ID":"31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8","Type":"ContainerDied","Data":"26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6"} Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.297700 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c3c7763b6a7087dbe46227cb8263cd4e006283a3845857a05816e842cb3bd6" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 
09:38:17.297754 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.379991 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:17 crc kubenswrapper[4792]: E0301 09:38:17.380407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380430 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: E0301 09:38:17.380442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380452 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380710 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.380733 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" containerName="oc" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.381451 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.383509 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.384525 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.384988 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.390920 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.392644 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.511810 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.613938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.614047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.614072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.617549 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.618464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.634046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:17 crc kubenswrapper[4792]: I0301 09:38:17.697093 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:18 crc kubenswrapper[4792]: W0301 09:38:18.259185 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8787b5ba_7462_4594_a11d_2d0afbfe3c1c.slice/crio-cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2 WatchSource:0}: Error finding container cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2: Status 404 returned error can't find the container with id cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2 Mar 01 09:38:18 crc kubenswrapper[4792]: I0301 09:38:18.264926 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:38:18 crc kubenswrapper[4792]: I0301 09:38:18.305509 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerStarted","Data":"cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2"} Mar 01 09:38:19 crc kubenswrapper[4792]: I0301 09:38:19.316387 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerStarted","Data":"a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496"} Mar 01 09:38:19 crc kubenswrapper[4792]: I0301 09:38:19.338018 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" podStartSLOduration=1.857130669 podStartE2EDuration="2.337999675s" podCreationTimestamp="2026-03-01 09:38:17 +0000 UTC" firstStartedPulling="2026-03-01 09:38:18.261493321 +0000 UTC m=+1827.503372518" lastFinishedPulling="2026-03-01 09:38:18.742362327 +0000 UTC m=+1827.984241524" 
observedRunningTime="2026-03-01 09:38:19.337744198 +0000 UTC m=+1828.579623395" watchObservedRunningTime="2026-03-01 09:38:19.337999675 +0000 UTC m=+1828.579878872" Mar 01 09:38:23 crc kubenswrapper[4792]: I0301 09:38:23.346882 4792 generic.go:334] "Generic (PLEG): container finished" podID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerID="a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496" exitCode=0 Mar 01 09:38:23 crc kubenswrapper[4792]: I0301 09:38:23.347421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerDied","Data":"a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496"} Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.801660 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960098 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") pod \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") pod \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.960417 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") pod 
\"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\" (UID: \"8787b5ba-7462-4594-a11d-2d0afbfe3c1c\") " Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.966233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj" (OuterVolumeSpecName: "kube-api-access-6lqcj") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "kube-api-access-6lqcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.986652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory" (OuterVolumeSpecName: "inventory") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:24 crc kubenswrapper[4792]: I0301 09:38:24.998494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8787b5ba-7462-4594-a11d-2d0afbfe3c1c" (UID: "8787b5ba-7462-4594-a11d-2d0afbfe3c1c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.062985 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqcj\" (UniqueName: \"kubernetes.io/projected/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-kube-api-access-6lqcj\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.063286 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.063296 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8787b5ba-7462-4594-a11d-2d0afbfe3c1c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" event={"ID":"8787b5ba-7462-4594-a11d-2d0afbfe3c1c","Type":"ContainerDied","Data":"cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2"} Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370495 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc836aef7a180983b871b836e0db2dda5971b8cc57635fccf899b4b8a37fabc2" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.370566 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445040 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:25 crc kubenswrapper[4792]: E0301 09:38:25.445477 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445505 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.445752 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.451376 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.453282 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.453334 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.454114 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.454609 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.460237 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.480346 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582474 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.582577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.586048 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.586225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.606015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:25 crc kubenswrapper[4792]: I0301 09:38:25.769436 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:38:26 crc kubenswrapper[4792]: I0301 09:38:26.260444 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:38:26 crc kubenswrapper[4792]: I0301 09:38:26.378562 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerStarted","Data":"05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b"} Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.041412 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.049384 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gbmwh"] Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.386768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerStarted","Data":"291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167"} Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.409041 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" podStartSLOduration=2.010579763 podStartE2EDuration="2.409020157s" podCreationTimestamp="2026-03-01 09:38:25 +0000 UTC" firstStartedPulling="2026-03-01 09:38:26.279762933 +0000 UTC m=+1835.521642130" lastFinishedPulling="2026-03-01 09:38:26.678203327 +0000 UTC m=+1835.920082524" observedRunningTime="2026-03-01 09:38:27.405897158 +0000 UTC m=+1836.647776355" watchObservedRunningTime="2026-03-01 09:38:27.409020157 +0000 UTC m=+1836.650899354" Mar 01 09:38:27 crc kubenswrapper[4792]: I0301 09:38:27.420320 
4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66aba873-81b0-452a-81f9-73cc18445180" path="/var/lib/kubelet/pods/66aba873-81b0-452a-81f9-73cc18445180/volumes" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.409400 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:28 crc kubenswrapper[4792]: E0301 09:38:28.409687 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.702408 4792 scope.go:117] "RemoveContainer" containerID="3d36f5fe200b4f79f67807598d19a358fe63f35f70500bebea7ecf29d4c8c11d" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.733014 4792 scope.go:117] "RemoveContainer" containerID="2a9eb88c21c0505fd080c3b8fba46cc255546b5fb4c130561920988c70383a89" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.766745 4792 scope.go:117] "RemoveContainer" containerID="01f9646a0afc7be0a0075cd21cc75eb15ebfcd51abc1ad1bd7ae6a925a4bfdd3" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.814674 4792 scope.go:117] "RemoveContainer" containerID="a5333afd5d7c2f19e4d0551bd45c113ef37b9f8fcc1a7b85eb962769ca9d63e5" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.845842 4792 scope.go:117] "RemoveContainer" containerID="1cc52ebb7e1b86f46dbab0e11949d60082faaf96962f0529b5c27c6156f59218" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.877354 4792 scope.go:117] "RemoveContainer" containerID="8ffaa11ad79d37459635175d7fbda620c8204659760d7268eb92ed800bd1a03d" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.929961 
4792 scope.go:117] "RemoveContainer" containerID="d1c76ac502d7f1c626951dc28dbcf8372a61e85a2c8a22e8bee3f4ce1c1f91c2" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.951844 4792 scope.go:117] "RemoveContainer" containerID="ca00fe55b9ec531c46f7a5dbc120ce6185518403d6904d883bc1ec756288e2a0" Mar 01 09:38:28 crc kubenswrapper[4792]: I0301 09:38:28.983294 4792 scope.go:117] "RemoveContainer" containerID="d49c80ba137c1dfc24f0da7a4050addac018dbbe5eed7701b9bf0c31b472eef5" Mar 01 09:38:29 crc kubenswrapper[4792]: I0301 09:38:29.007826 4792 scope.go:117] "RemoveContainer" containerID="acaeed4a0d4cc4c819f11994601b2946e5e093014d4fc45dbb6ce057d16aef6a" Mar 01 09:38:30 crc kubenswrapper[4792]: I0301 09:38:30.029349 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:38:30 crc kubenswrapper[4792]: I0301 09:38:30.039624 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-f89zl"] Mar 01 09:38:31 crc kubenswrapper[4792]: I0301 09:38:31.420535 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e623b24a-64a5-4209-86bb-1814ae9c400b" path="/var/lib/kubelet/pods/e623b24a-64a5-4209-86bb-1814ae9c400b/volumes" Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.028687 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.036239 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m9l5f"] Mar 01 09:38:33 crc kubenswrapper[4792]: I0301 09:38:33.419328 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7108e9ac-8215-41ca-ac84-3b3851142a42" path="/var/lib/kubelet/pods/7108e9ac-8215-41ca-ac84-3b3851142a42/volumes" Mar 01 09:38:41 crc kubenswrapper[4792]: I0301 09:38:41.413768 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:41 crc 
kubenswrapper[4792]: E0301 09:38:41.414509 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:38:48 crc kubenswrapper[4792]: I0301 09:38:48.033450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:38:48 crc kubenswrapper[4792]: I0301 09:38:48.041062 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bxx5d"] Mar 01 09:38:49 crc kubenswrapper[4792]: I0301 09:38:49.418400 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6bad7a-881b-4ef4-9916-f447e2fc1ffd" path="/var/lib/kubelet/pods/9e6bad7a-881b-4ef4-9916-f447e2fc1ffd/volumes" Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.037570 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.046776 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gsxqb"] Mar 01 09:38:51 crc kubenswrapper[4792]: I0301 09:38:51.422354 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737aa0a0-6e53-451e-9d5f-2deada87b5b4" path="/var/lib/kubelet/pods/737aa0a0-6e53-451e-9d5f-2deada87b5b4/volumes" Mar 01 09:38:52 crc kubenswrapper[4792]: I0301 09:38:52.408677 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:38:52 crc kubenswrapper[4792]: E0301 09:38:52.409217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:04 crc kubenswrapper[4792]: I0301 09:39:04.412181 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:04 crc kubenswrapper[4792]: E0301 09:39:04.412882 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:15 crc kubenswrapper[4792]: I0301 09:39:15.408996 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:15 crc kubenswrapper[4792]: E0301 09:39:15.409842 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:18 crc kubenswrapper[4792]: I0301 09:39:18.827509 4792 generic.go:334] "Generic (PLEG): container finished" podID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerID="291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167" exitCode=0 Mar 01 09:39:18 crc kubenswrapper[4792]: I0301 09:39:18.827573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerDied","Data":"291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167"} Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.228202 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351006 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.351301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") pod \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\" (UID: \"a6a7e948-b141-4fb0-b717-3d02a9014dd4\") " Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.356962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps" (OuterVolumeSpecName: "kube-api-access-tgrps") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "kube-api-access-tgrps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.393456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.395353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory" (OuterVolumeSpecName: "inventory") pod "a6a7e948-b141-4fb0-b717-3d02a9014dd4" (UID: "a6a7e948-b141-4fb0-b717-3d02a9014dd4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.453966 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.454641 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6a7e948-b141-4fb0-b717-3d02a9014dd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.454656 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrps\" (UniqueName: \"kubernetes.io/projected/a6a7e948-b141-4fb0-b717-3d02a9014dd4-kube-api-access-tgrps\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" 
event={"ID":"a6a7e948-b141-4fb0-b717-3d02a9014dd4","Type":"ContainerDied","Data":"05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b"} Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845740 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05fc82db8e8a9121da2f3c828eec2d2e2261fe72fe811c5f9e0e91a7c4ce867b" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.845792 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.939835 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:20 crc kubenswrapper[4792]: E0301 09:39:20.940388 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.940409 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.940625 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.941395 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.943694 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.943945 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.944181 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.944390 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:20 crc kubenswrapper[4792]: I0301 09:39:20.962974 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.065795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.065921 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.066004 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167792 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.167920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.175862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc 
kubenswrapper[4792]: I0301 09:39:21.176382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.185421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"ssh-known-hosts-edpm-deployment-27xxx\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.259729 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.761232 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:39:21 crc kubenswrapper[4792]: W0301 09:39:21.768036 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5ccb279_c8b2_4288_9072_1175061be204.slice/crio-1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b WatchSource:0}: Error finding container 1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b: Status 404 returned error can't find the container with id 1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.771825 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:39:21 crc kubenswrapper[4792]: I0301 09:39:21.854707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerStarted","Data":"1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b"} Mar 01 09:39:22 crc kubenswrapper[4792]: I0301 09:39:22.869974 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerStarted","Data":"6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217"} Mar 01 09:39:22 crc kubenswrapper[4792]: I0301 09:39:22.897050 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" podStartSLOduration=2.423246503 podStartE2EDuration="2.897033189s" podCreationTimestamp="2026-03-01 09:39:20 +0000 UTC" firstStartedPulling="2026-03-01 09:39:21.771532701 +0000 UTC m=+1891.013411908" lastFinishedPulling="2026-03-01 09:39:22.245319397 +0000 UTC m=+1891.487198594" observedRunningTime="2026-03-01 09:39:22.887858826 +0000 UTC m=+1892.129738043" watchObservedRunningTime="2026-03-01 09:39:22.897033189 +0000 UTC m=+1892.138912386" Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.408890 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:28 crc kubenswrapper[4792]: E0301 09:39:28.409510 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.916626 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="b5ccb279-c8b2-4288-9072-1175061be204" containerID="6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217" exitCode=0 Mar 01 09:39:28 crc kubenswrapper[4792]: I0301 09:39:28.916688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerDied","Data":"6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217"} Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.215746 4792 scope.go:117] "RemoveContainer" containerID="5753e582a896b76584d26c8b6fbaf1b0c86841fe9960e87056d0ee4ab735dcee" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.264378 4792 scope.go:117] "RemoveContainer" containerID="eb8be93bccd98a25e35b0b04ca4b752b359f9eaa5d34f76412ea01464dd8c3f9" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.288960 4792 scope.go:117] "RemoveContainer" containerID="4a5e793bcbd54f67d2aa56894763cca0ce1c06ab0ab5c25152dbd8e3b2985066" Mar 01 09:39:29 crc kubenswrapper[4792]: I0301 09:39:29.330431 4792 scope.go:117] "RemoveContainer" containerID="fded697acef2e4939680efd28c30dd0707c1b449f6152a36c981b92695845052" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.295420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.346546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.348715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.348797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") pod \"b5ccb279-c8b2-4288-9072-1175061be204\" (UID: \"b5ccb279-c8b2-4288-9072-1175061be204\") " Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.367802 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm" (OuterVolumeSpecName: "kube-api-access-8whbm") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). InnerVolumeSpecName "kube-api-access-8whbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.380846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.401209 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b5ccb279-c8b2-4288-9072-1175061be204" (UID: "b5ccb279-c8b2-4288-9072-1175061be204"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.451311 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whbm\" (UniqueName: \"kubernetes.io/projected/b5ccb279-c8b2-4288-9072-1175061be204-kube-api-access-8whbm\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.452872 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.453013 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5ccb279-c8b2-4288-9072-1175061be204-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" event={"ID":"b5ccb279-c8b2-4288-9072-1175061be204","Type":"ContainerDied","Data":"1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b"} Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933406 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbb3c139b7d53c7d463ca62df59c4f4dd38f92de5cd1ea315a5ed35bc41088b" Mar 01 09:39:30 crc kubenswrapper[4792]: I0301 09:39:30.933126 
4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-27xxx" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.007108 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:31 crc kubenswrapper[4792]: E0301 09:39:31.007662 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.007728 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.008010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ccb279-c8b2-4288-9072-1175061be204" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.008692 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015173 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015323 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015704 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.015885 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.031505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.063775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.064022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.064089 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.166940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.171305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.175372 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.191632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-j8zd5\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.333579 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:31 crc kubenswrapper[4792]: I0301 09:39:31.939566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.953860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerStarted","Data":"4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d"} Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.954201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerStarted","Data":"fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723"} Mar 01 09:39:32 crc kubenswrapper[4792]: I0301 09:39:32.969251 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" podStartSLOduration=2.5538495020000003 podStartE2EDuration="2.969234086s" podCreationTimestamp="2026-03-01 09:39:30 +0000 UTC" firstStartedPulling="2026-03-01 09:39:31.950021955 +0000 UTC m=+1901.191901162" lastFinishedPulling="2026-03-01 09:39:32.365406549 +0000 UTC m=+1901.607285746" observedRunningTime="2026-03-01 09:39:32.966954598 +0000 UTC m=+1902.208833795" watchObservedRunningTime="2026-03-01 09:39:32.969234086 +0000 UTC m=+1902.211113283" Mar 01 09:39:35 crc kubenswrapper[4792]: I0301 09:39:35.047214 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:39:35 crc kubenswrapper[4792]: I0301 09:39:35.054055 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3b7a-account-create-update-2vc26"] Mar 01 09:39:35 crc kubenswrapper[4792]: 
I0301 09:39:35.421806 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2" path="/var/lib/kubelet/pods/e6cd5ba6-1b42-4c95-b84a-0a00aa9a65c2/volumes" Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.047541 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.059522 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.081949 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.094993 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.104229 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.113141 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zj224"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.121652 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8f2r2"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.128887 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-474a-account-create-update-dlgkl"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.136606 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qt42r"] Mar 01 09:39:36 crc kubenswrapper[4792]: I0301 09:39:36.150476 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c72f-account-create-update-x9vvj"] Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.421662 4792 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="09b4c86e-31ba-4d91-a602-39fa3a57c798" path="/var/lib/kubelet/pods/09b4c86e-31ba-4d91-a602-39fa3a57c798/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.423127 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b0442e-f4b4-4f59-b3c5-1510ae4d792c" path="/var/lib/kubelet/pods/21b0442e-f4b4-4f59-b3c5-1510ae4d792c/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.423678 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a069955e-f546-4522-97ec-5a529f79b1aa" path="/var/lib/kubelet/pods/a069955e-f546-4522-97ec-5a529f79b1aa/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.424257 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe" path="/var/lib/kubelet/pods/f28ebfd5-929d-49ad-bfcd-82aa39d0d2fe/volumes" Mar 01 09:39:37 crc kubenswrapper[4792]: I0301 09:39:37.425484 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2be4f49-c20a-4e25-bff3-e4617d275fa1" path="/var/lib/kubelet/pods/f2be4f49-c20a-4e25-bff3-e4617d275fa1/volumes" Mar 01 09:39:39 crc kubenswrapper[4792]: I0301 09:39:39.409126 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:39 crc kubenswrapper[4792]: E0301 09:39:39.409764 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:41 crc kubenswrapper[4792]: I0301 09:39:41.010855 4792 generic.go:334] "Generic (PLEG): container finished" podID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" 
containerID="4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d" exitCode=0 Mar 01 09:39:41 crc kubenswrapper[4792]: I0301 09:39:41.010925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerDied","Data":"4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d"} Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.498520 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.672791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") pod \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\" (UID: \"5c6429ad-21d8-4f58-900b-e5f6fe4d603d\") " Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.678415 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr" (OuterVolumeSpecName: "kube-api-access-99hfr") pod 
"5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "kube-api-access-99hfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.698150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory" (OuterVolumeSpecName: "inventory") pod "5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.703798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c6429ad-21d8-4f58-900b-e5f6fe4d603d" (UID: "5c6429ad-21d8-4f58-900b-e5f6fe4d603d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774054 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774084 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99hfr\" (UniqueName: \"kubernetes.io/projected/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-kube-api-access-99hfr\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:42 crc kubenswrapper[4792]: I0301 09:39:42.774095 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c6429ad-21d8-4f58-900b-e5f6fe4d603d-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032754 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" event={"ID":"5c6429ad-21d8-4f58-900b-e5f6fe4d603d","Type":"ContainerDied","Data":"fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723"} Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032798 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb07e1f6288a759448ea11b2782889c47aeb90f77ddbe7b61acd663c1b9e7723" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.032821 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200005 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:43 crc kubenswrapper[4792]: E0301 09:39:43.200371 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200392 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.200564 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.201122 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204573 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204596 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204840 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.204887 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.218847 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.384228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.384864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.385002 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.486996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.487866 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.487962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.490680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.493553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.504390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:43 crc kubenswrapper[4792]: I0301 09:39:43.517639 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:44 crc kubenswrapper[4792]: I0301 09:39:44.014876 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:39:44 crc kubenswrapper[4792]: I0301 09:39:44.042755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerStarted","Data":"30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20"} Mar 01 09:39:45 crc kubenswrapper[4792]: I0301 09:39:45.050759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerStarted","Data":"c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf"} Mar 01 09:39:45 crc kubenswrapper[4792]: I0301 09:39:45.065160 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" podStartSLOduration=1.672184384 podStartE2EDuration="2.065145879s" podCreationTimestamp="2026-03-01 09:39:43 +0000 UTC" firstStartedPulling="2026-03-01 09:39:44.019857487 +0000 UTC m=+1913.261736684" lastFinishedPulling="2026-03-01 09:39:44.412818982 +0000 UTC m=+1913.654698179" observedRunningTime="2026-03-01 09:39:45.061832615 +0000 UTC m=+1914.303711802" watchObservedRunningTime="2026-03-01 09:39:45.065145879 +0000 UTC m=+1914.307025076" Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.182255 4792 generic.go:334] "Generic (PLEG): container finished" podID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerID="c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf" exitCode=0 Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.182325 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerDied","Data":"c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf"} Mar 01 09:39:54 crc kubenswrapper[4792]: I0301 09:39:54.409176 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:39:54 crc kubenswrapper[4792]: E0301 09:39:54.409884 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.610274 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669284 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669502 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.669563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") pod \"91d95c97-b82e-413c-b05a-3e9cb36e504e\" (UID: \"91d95c97-b82e-413c-b05a-3e9cb36e504e\") " Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.678170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765" (OuterVolumeSpecName: "kube-api-access-nk765") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "kube-api-access-nk765". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.699540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.702158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory" (OuterVolumeSpecName: "inventory") pod "91d95c97-b82e-413c-b05a-3e9cb36e504e" (UID: "91d95c97-b82e-413c-b05a-3e9cb36e504e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771744 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771787 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91d95c97-b82e-413c-b05a-3e9cb36e504e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:55 crc kubenswrapper[4792]: I0301 09:39:55.771801 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk765\" (UniqueName: \"kubernetes.io/projected/91d95c97-b82e-413c-b05a-3e9cb36e504e-kube-api-access-nk765\") on node \"crc\" DevicePath \"\"" Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" 
event={"ID":"91d95c97-b82e-413c-b05a-3e9cb36e504e","Type":"ContainerDied","Data":"30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20"} Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201736 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30bb0ac3c1d1f68b2a280b73be598d2147d1223e5f8cd999679a757d2ca28d20" Mar 01 09:39:56 crc kubenswrapper[4792]: I0301 09:39:56.201758 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.137730 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:00 crc kubenswrapper[4792]: E0301 09:40:00.138833 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.138853 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.139146 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.139808 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143090 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143422 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.143422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.150135 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.271066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.373431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.393923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"auto-csr-approver-29539300-ws5xb\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " 
pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.464043 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:00 crc kubenswrapper[4792]: I0301 09:40:00.908381 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:40:01 crc kubenswrapper[4792]: I0301 09:40:01.272578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerStarted","Data":"7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c"} Mar 01 09:40:02 crc kubenswrapper[4792]: I0301 09:40:02.281343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerStarted","Data":"8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c"} Mar 01 09:40:02 crc kubenswrapper[4792]: I0301 09:40:02.296776 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" podStartSLOduration=1.372850259 podStartE2EDuration="2.29675851s" podCreationTimestamp="2026-03-01 09:40:00 +0000 UTC" firstStartedPulling="2026-03-01 09:40:00.913297914 +0000 UTC m=+1930.155177111" lastFinishedPulling="2026-03-01 09:40:01.837206165 +0000 UTC m=+1931.079085362" observedRunningTime="2026-03-01 09:40:02.293284412 +0000 UTC m=+1931.535163609" watchObservedRunningTime="2026-03-01 09:40:02.29675851 +0000 UTC m=+1931.538637707" Mar 01 09:40:03 crc kubenswrapper[4792]: I0301 09:40:03.292041 4792 generic.go:334] "Generic (PLEG): container finished" podID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerID="8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c" exitCode=0 Mar 01 09:40:03 crc 
kubenswrapper[4792]: I0301 09:40:03.292278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerDied","Data":"8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c"} Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.596292 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.748166 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") pod \"79584910-9524-4e1c-8edf-5411aa71eb0a\" (UID: \"79584910-9524-4e1c-8edf-5411aa71eb0a\") " Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.755239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql" (OuterVolumeSpecName: "kube-api-access-6hmql") pod "79584910-9524-4e1c-8edf-5411aa71eb0a" (UID: "79584910-9524-4e1c-8edf-5411aa71eb0a"). InnerVolumeSpecName "kube-api-access-6hmql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:40:04 crc kubenswrapper[4792]: I0301 09:40:04.849894 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmql\" (UniqueName: \"kubernetes.io/projected/79584910-9524-4e1c-8edf-5411aa71eb0a-kube-api-access-6hmql\") on node \"crc\" DevicePath \"\"" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.036553 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.043840 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tp5l7"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" event={"ID":"79584910-9524-4e1c-8edf-5411aa71eb0a","Type":"ContainerDied","Data":"7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c"} Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308408 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de1a92d2ee09022176111af02588639b0d73b7ea1861e3604c90244c0b0692c" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.308458 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539300-ws5xb" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.423245 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6ead5c-face-41ff-ab6e-aebb7ca73c1c" path="/var/lib/kubelet/pods/de6ead5c-face-41ff-ab6e-aebb7ca73c1c/volumes" Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.654854 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:40:05 crc kubenswrapper[4792]: I0301 09:40:05.663288 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539294-5jtlh"] Mar 01 09:40:07 crc kubenswrapper[4792]: I0301 09:40:07.408626 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:07 crc kubenswrapper[4792]: E0301 09:40:07.408923 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:07 crc kubenswrapper[4792]: I0301 09:40:07.419171 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6725e35-5100-4360-85ca-00aad33007d4" path="/var/lib/kubelet/pods/a6725e35-5100-4360-85ca-00aad33007d4/volumes" Mar 01 09:40:19 crc kubenswrapper[4792]: I0301 09:40:19.408881 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:19 crc kubenswrapper[4792]: E0301 09:40:19.410706 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:24 crc kubenswrapper[4792]: I0301 09:40:24.034046 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:40:24 crc kubenswrapper[4792]: I0301 09:40:24.043937 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8vfwt"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.081215 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.098914 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tjd85"] Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.424009 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a84376-7418-49cd-9c62-fdd1af7ec31b" path="/var/lib/kubelet/pods/32a84376-7418-49cd-9c62-fdd1af7ec31b/volumes" Mar 01 09:40:25 crc kubenswrapper[4792]: I0301 09:40:25.424549 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7269b8b7-440f-4fae-b0f1-f624e9d5b29a" path="/var/lib/kubelet/pods/7269b8b7-440f-4fae-b0f1-f624e9d5b29a/volumes" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.452889 4792 scope.go:117] "RemoveContainer" containerID="05d5887b441a9b375453d0ad6f9bd8826e5d3d116043c948abf3e299df007d6e" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.473952 4792 scope.go:117] "RemoveContainer" containerID="e107bd42472fceec66462b44aaa6f7f47fb07e9eba8ac8e30bec4fee69d4eff3" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.547264 4792 scope.go:117] "RemoveContainer" containerID="7b120b9d05aec1bbdb715a0cec1430208c27b75e09d6308e245c67d773de0e22" Mar 01 09:40:29 
crc kubenswrapper[4792]: I0301 09:40:29.596384 4792 scope.go:117] "RemoveContainer" containerID="cc49a2b9acb35bd4588c6c9cb6d10085e66d67e1476ad98c515c87fcc8a40be2" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.643935 4792 scope.go:117] "RemoveContainer" containerID="d36c7b1a79f4c4f6b03c61a96da47d8288af453f4f9225ccf2c3d4099f44d0df" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.664591 4792 scope.go:117] "RemoveContainer" containerID="7a8d1321567d66f2ccb1955a5edf06d8800b1b50205fae01644b45e7fa573653" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.699085 4792 scope.go:117] "RemoveContainer" containerID="82dc2fec75535cf7d5ba98c257213dcf0b978f1770576dc67153fee7dc3473af" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.721340 4792 scope.go:117] "RemoveContainer" containerID="92439593a89d55067268c29652937630710da016f1c0b75141189d39ceeced86" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.741469 4792 scope.go:117] "RemoveContainer" containerID="dffb4e0c554e65ca661a311e46e8c09863ee0d9c8fb1783c221a0323e637a5f5" Mar 01 09:40:29 crc kubenswrapper[4792]: I0301 09:40:29.762437 4792 scope.go:117] "RemoveContainer" containerID="b0e1e09f850992f4661d6be2a0a76260a157f0e5c0875fd88ff0bc92644a8d13" Mar 01 09:40:34 crc kubenswrapper[4792]: I0301 09:40:34.409140 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:34 crc kubenswrapper[4792]: E0301 09:40:34.409717 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:40:49 crc kubenswrapper[4792]: I0301 09:40:49.408777 4792 scope.go:117] 
"RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:40:49 crc kubenswrapper[4792]: E0301 09:40:49.409601 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:04 crc kubenswrapper[4792]: I0301 09:41:04.408734 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:04 crc kubenswrapper[4792]: E0301 09:41:04.409452 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:08 crc kubenswrapper[4792]: I0301 09:41:08.050736 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:41:08 crc kubenswrapper[4792]: I0301 09:41:08.058091 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6q8nq"] Mar 01 09:41:09 crc kubenswrapper[4792]: I0301 09:41:09.419000 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8782d670-70cd-42cc-b4d7-c0c8275e457b" path="/var/lib/kubelet/pods/8782d670-70cd-42cc-b4d7-c0c8275e457b/volumes" Mar 01 09:41:19 crc kubenswrapper[4792]: I0301 09:41:19.409998 4792 scope.go:117] "RemoveContainer" 
containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:19 crc kubenswrapper[4792]: E0301 09:41:19.410946 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:29 crc kubenswrapper[4792]: I0301 09:41:29.918428 4792 scope.go:117] "RemoveContainer" containerID="c8dd8174a7f29b772205511318ce1d6a20b6f7ef7820849db345c8f1e46e0166" Mar 01 09:41:32 crc kubenswrapper[4792]: I0301 09:41:32.408859 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:32 crc kubenswrapper[4792]: E0301 09:41:32.409504 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:41:47 crc kubenswrapper[4792]: I0301 09:41:47.409237 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:41:47 crc kubenswrapper[4792]: E0301 09:41:47.411613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.141087 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:00 crc kubenswrapper[4792]: E0301 09:42:00.143556 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.143647 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.143952 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" containerName="oc" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.144590 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.147689 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.147890 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.148441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.153011 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.189020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkntj\" (UniqueName: 
\"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.291341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.309017 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"auto-csr-approver-29539302-26gpt\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.408679 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:00 crc kubenswrapper[4792]: E0301 09:42:00.409069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.494390 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:00 crc kubenswrapper[4792]: I0301 09:42:00.714541 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:42:01 crc kubenswrapper[4792]: I0301 09:42:01.395691 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerStarted","Data":"1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34"} Mar 01 09:42:02 crc kubenswrapper[4792]: I0301 09:42:02.404141 4792 generic.go:334] "Generic (PLEG): container finished" podID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerID="53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708" exitCode=0 Mar 01 09:42:02 crc kubenswrapper[4792]: I0301 09:42:02.404178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerDied","Data":"53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708"} Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.718635 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.855950 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") pod \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\" (UID: \"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5\") " Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.866877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj" (OuterVolumeSpecName: "kube-api-access-vkntj") pod "792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" (UID: "792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5"). InnerVolumeSpecName "kube-api-access-vkntj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:42:03 crc kubenswrapper[4792]: I0301 09:42:03.958029 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkntj\" (UniqueName: \"kubernetes.io/projected/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5-kube-api-access-vkntj\") on node \"crc\" DevicePath \"\"" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539302-26gpt" event={"ID":"792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5","Type":"ContainerDied","Data":"1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34"} Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425805 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cad343828afa0c050d70d2277cd2b09f0e50afc289e1270ab226b13616a6c34" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.425882 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539302-26gpt" Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.788996 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:42:04 crc kubenswrapper[4792]: I0301 09:42:04.796233 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539296-87tnt"] Mar 01 09:42:05 crc kubenswrapper[4792]: I0301 09:42:05.419282 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0029741-30a3-4fc2-b71d-c77dbd652c35" path="/var/lib/kubelet/pods/f0029741-30a3-4fc2-b71d-c77dbd652c35/volumes" Mar 01 09:42:12 crc kubenswrapper[4792]: I0301 09:42:12.408946 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:12 crc kubenswrapper[4792]: E0301 09:42:12.409723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:24 crc kubenswrapper[4792]: I0301 09:42:24.408691 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:24 crc kubenswrapper[4792]: E0301 09:42:24.409374 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:42:29 crc kubenswrapper[4792]: I0301 09:42:29.988145 4792 scope.go:117] "RemoveContainer" containerID="8b912a3a88a6f648dd530babff742abe47a9567e05895bab6379ed09d8bc8a56" Mar 01 09:42:37 crc kubenswrapper[4792]: I0301 09:42:37.409066 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:42:37 crc kubenswrapper[4792]: I0301 09:42:37.692877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"} Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.614148 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:14 crc kubenswrapper[4792]: E0301 09:43:14.615143 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.615158 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.615364 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" containerName="oc" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.616829 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.622685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.780812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.781218 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.781267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883794 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.883944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.884288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.884362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.908745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"redhat-operators-m7wjn\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:14 crc kubenswrapper[4792]: I0301 09:43:14.993538 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.504433 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991154 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" exitCode=0 Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991244 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925"} Mar 01 09:43:15 crc kubenswrapper[4792]: I0301 09:43:15.991527 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"10a006007bcbaefd78d690abc403d6226b7d5583a504c774c24e533714ae4bd4"} Mar 01 09:43:17 crc kubenswrapper[4792]: I0301 09:43:17.002251 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} Mar 01 09:43:22 crc kubenswrapper[4792]: I0301 09:43:22.038245 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" exitCode=0 Mar 01 09:43:22 crc kubenswrapper[4792]: I0301 09:43:22.038465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" 
event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} Mar 01 09:43:23 crc kubenswrapper[4792]: I0301 09:43:23.047093 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerStarted","Data":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} Mar 01 09:43:23 crc kubenswrapper[4792]: I0301 09:43:23.065184 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m7wjn" podStartSLOduration=2.661028159 podStartE2EDuration="9.065166796s" podCreationTimestamp="2026-03-01 09:43:14 +0000 UTC" firstStartedPulling="2026-03-01 09:43:15.992922702 +0000 UTC m=+2125.234801899" lastFinishedPulling="2026-03-01 09:43:22.397061339 +0000 UTC m=+2131.638940536" observedRunningTime="2026-03-01 09:43:23.062354285 +0000 UTC m=+2132.304233482" watchObservedRunningTime="2026-03-01 09:43:23.065166796 +0000 UTC m=+2132.307045993" Mar 01 09:43:24 crc kubenswrapper[4792]: I0301 09:43:24.993645 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:24 crc kubenswrapper[4792]: I0301 09:43:24.993995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:26 crc kubenswrapper[4792]: I0301 09:43:26.044587 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m7wjn" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" probeResult="failure" output=< Mar 01 09:43:26 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:43:26 crc kubenswrapper[4792]: > Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.037242 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.081844 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:35 crc kubenswrapper[4792]: I0301 09:43:35.273521 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.153282 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m7wjn" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" containerID="cri-o://b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" gracePeriod=2 Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.532492 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") pod \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") pod \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.726775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") pod 
\"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\" (UID: \"0a8e78a0-a8cd-450d-ad43-bb8060b2111c\") " Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.727326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities" (OuterVolumeSpecName: "utilities") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.734659 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7" (OuterVolumeSpecName: "kube-api-access-4b6b7") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "kube-api-access-4b6b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.829266 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.829756 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b6b7\" (UniqueName: \"kubernetes.io/projected/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-kube-api-access-4b6b7\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.845272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a8e78a0-a8cd-450d-ad43-bb8060b2111c" (UID: "0a8e78a0-a8cd-450d-ad43-bb8060b2111c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:43:36 crc kubenswrapper[4792]: I0301 09:43:36.932675 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8e78a0-a8cd-450d-ad43-bb8060b2111c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173274 4792 generic.go:334] "Generic (PLEG): container finished" podID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" exitCode=0 Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173314 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m7wjn" event={"ID":"0a8e78a0-a8cd-450d-ad43-bb8060b2111c","Type":"ContainerDied","Data":"10a006007bcbaefd78d690abc403d6226b7d5583a504c774c24e533714ae4bd4"} Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173348 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m7wjn" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.173363 4792 scope.go:117] "RemoveContainer" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.189787 4792 scope.go:117] "RemoveContainer" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.218230 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.224237 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m7wjn"] Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.226110 4792 scope.go:117] "RemoveContainer" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.263367 4792 scope.go:117] "RemoveContainer" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.265314 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": container with ID starting with b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13 not found: ID does not exist" containerID="b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.265361 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13"} err="failed to get container status \"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": rpc error: code = NotFound desc = could not find container 
\"b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13\": container with ID starting with b3aab0c7882b63849c8964a4efa9c80b782da2e3bdd3f64208c27669cf25ba13 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.265386 4792 scope.go:117] "RemoveContainer" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.266621 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": container with ID starting with dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86 not found: ID does not exist" containerID="dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.266653 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86"} err="failed to get container status \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": rpc error: code = NotFound desc = could not find container \"dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86\": container with ID starting with dfb04538eb0c09dde5d7f9310f315e428388d1e50f9cf7cff948a58ab0e26c86 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.266694 4792 scope.go:117] "RemoveContainer" containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: E0301 09:43:37.266989 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": container with ID starting with 085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925 not found: ID does not exist" 
containerID="085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.267089 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925"} err="failed to get container status \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": rpc error: code = NotFound desc = could not find container \"085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925\": container with ID starting with 085cd189a3c67b0c9c3f79768720778b7cba1eb93db881daadfa726dffc69925 not found: ID does not exist" Mar 01 09:43:37 crc kubenswrapper[4792]: I0301 09:43:37.418161 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" path="/var/lib/kubelet/pods/0a8e78a0-a8cd-450d-ad43-bb8060b2111c/volumes" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.144765 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145674 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-utilities" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145689 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-utilities" Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145707 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-content" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145714 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="extract-content" Mar 01 09:44:00 crc kubenswrapper[4792]: E0301 09:44:00.145748 4792 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.145757 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.146045 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8e78a0-a8cd-450d-ad43-bb8060b2111c" containerName="registry-server" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.146746 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151419 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151497 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.151590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.158473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.253648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.355822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6szt8\" 
(UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.380431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"auto-csr-approver-29539304-c2vn2\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.478732 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:00 crc kubenswrapper[4792]: I0301 09:44:00.903854 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:44:01 crc kubenswrapper[4792]: I0301 09:44:01.361511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerStarted","Data":"4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded"} Mar 01 09:44:02 crc kubenswrapper[4792]: I0301 09:44:02.371102 4792 generic.go:334] "Generic (PLEG): container finished" podID="97e68f99-8c1f-4046-bb89-66516bff6370" containerID="3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407" exitCode=0 Mar 01 09:44:02 crc kubenswrapper[4792]: I0301 09:44:02.371207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerDied","Data":"3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407"} Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.716925 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.818847 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") pod \"97e68f99-8c1f-4046-bb89-66516bff6370\" (UID: \"97e68f99-8c1f-4046-bb89-66516bff6370\") " Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.825036 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8" (OuterVolumeSpecName: "kube-api-access-6szt8") pod "97e68f99-8c1f-4046-bb89-66516bff6370" (UID: "97e68f99-8c1f-4046-bb89-66516bff6370"). InnerVolumeSpecName "kube-api-access-6szt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:44:03 crc kubenswrapper[4792]: I0301 09:44:03.920758 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6szt8\" (UniqueName: \"kubernetes.io/projected/97e68f99-8c1f-4046-bb89-66516bff6370-kube-api-access-6szt8\") on node \"crc\" DevicePath \"\"" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390079 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" event={"ID":"97e68f99-8c1f-4046-bb89-66516bff6370","Type":"ContainerDied","Data":"4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded"} Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390376 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e11349cda068ff64a159b98fad5cbc797c08e7d4f6ef9edc61ac56dff713ded" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.390447 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539304-c2vn2" Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.785392 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:44:04 crc kubenswrapper[4792]: I0301 09:44:04.792244 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539298-ckkqh"] Mar 01 09:44:05 crc kubenswrapper[4792]: I0301 09:44:05.417323 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e9f36fa-467b-4b49-9d69-b465a22837e5" path="/var/lib/kubelet/pods/7e9f36fa-467b-4b49-9d69-b465a22837e5/volumes" Mar 01 09:44:30 crc kubenswrapper[4792]: I0301 09:44:30.085704 4792 scope.go:117] "RemoveContainer" containerID="fe8d25b14be5e63dea82359594e611616a93ae643f10a8ff38209498ecbc612f" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.144482 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:00 crc kubenswrapper[4792]: E0301 09:45:00.145475 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.145492 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.145720 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" containerName="oc" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.146486 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.149355 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.149854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.156426 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242382 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.242494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.343895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.343992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.344018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.344959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.350201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.365467 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"collect-profiles-29539305-ml8xz\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.466316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:00 crc kubenswrapper[4792]: I0301 09:45:00.893505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862372 4792 generic.go:334] "Generic (PLEG): container finished" podID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerID="5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c" exitCode=0 Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerDied","Data":"5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c"} Mar 01 09:45:01 crc kubenswrapper[4792]: I0301 09:45:01.862680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" 
event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerStarted","Data":"0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91"} Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.222148 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294462 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.294523 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") pod \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\" (UID: \"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c\") " Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.295468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.305180 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.305250 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp" (OuterVolumeSpecName: "kube-api-access-k4xqp") pod "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" (UID: "e7a9ad8e-1c99-4a79-87eb-912aab1dc48c"). InnerVolumeSpecName "kube-api-access-k4xqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396863 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396899 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.396932 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xqp\" (UniqueName: \"kubernetes.io/projected/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c-kube-api-access-k4xqp\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877418 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" 
event={"ID":"e7a9ad8e-1c99-4a79-87eb-912aab1dc48c","Type":"ContainerDied","Data":"0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91"} Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877460 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df8929d04bb8c6b5b519f1a9969780072c9fcb0dfa7729cd770332cd5028e91" Mar 01 09:45:03 crc kubenswrapper[4792]: I0301 09:45:03.877580 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz" Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.296810 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.304131 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539260-g6qtn"] Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.943249 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:45:04 crc kubenswrapper[4792]: I0301 09:45:04.943298 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:45:05 crc kubenswrapper[4792]: I0301 09:45:05.422152 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3" path="/var/lib/kubelet/pods/6c1a0aad-45a6-45d3-bc5d-bbbf2e4fdcc3/volumes" Mar 01 09:45:08 crc 
kubenswrapper[4792]: I0301 09:45:08.089830 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.095504 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.104014 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.111439 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.117845 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nxkcw"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.124103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rqpgs"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.129867 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.135536 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.141779 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9tbn2"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.147433 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.152797 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-j8zd5"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.158146 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-l2lkt"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.163437 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.170567 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.177203 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7zbj"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.185000 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-27xxx"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.191178 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ms8fh"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.197095 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8qfln"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.202675 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"] Mar 01 09:45:08 crc kubenswrapper[4792]: I0301 09:45:08.208038 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kbjsl"] Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.419667 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f054d9d-4fbb-4909-826c-e6037c4716bd" path="/var/lib/kubelet/pods/1f054d9d-4fbb-4909-826c-e6037c4716bd/volumes" 
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.421560 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8" path="/var/lib/kubelet/pods/31cd2c0e-a66b-4944-9fbf-0e09deaf2ee8/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.422738 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a742181-aebe-42f8-a83e-fee7b480366b" path="/var/lib/kubelet/pods/4a742181-aebe-42f8-a83e-fee7b480366b/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.423416 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e68c85-54c7-4855-b4a0-a85d2014c7b7" path="/var/lib/kubelet/pods/54e68c85-54c7-4855-b4a0-a85d2014c7b7/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.424560 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6429ad-21d8-4f58-900b-e5f6fe4d603d" path="/var/lib/kubelet/pods/5c6429ad-21d8-4f58-900b-e5f6fe4d603d/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.425482 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8787b5ba-7462-4594-a11d-2d0afbfe3c1c" path="/var/lib/kubelet/pods/8787b5ba-7462-4594-a11d-2d0afbfe3c1c/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.426055 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d95c97-b82e-413c-b05a-3e9cb36e504e" path="/var/lib/kubelet/pods/91d95c97-b82e-413c-b05a-3e9cb36e504e/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.427022 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af8a1fb-52d8-4b08-be39-ad106833ba1c" path="/var/lib/kubelet/pods/9af8a1fb-52d8-4b08-be39-ad106833ba1c/volumes" Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.427570 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a7e948-b141-4fb0-b717-3d02a9014dd4" path="/var/lib/kubelet/pods/a6a7e948-b141-4fb0-b717-3d02a9014dd4/volumes" 
Mar 01 09:45:09 crc kubenswrapper[4792]: I0301 09:45:09.428129 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ccb279-c8b2-4288-9072-1175061be204" path="/var/lib/kubelet/pods/b5ccb279-c8b2-4288-9072-1175061be204/volumes" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.101350 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"] Mar 01 09:45:21 crc kubenswrapper[4792]: E0301 09:45:21.102331 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.102347 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.102794 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" containerName="collect-profiles" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.103430 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.106888 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107193 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107417 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107707 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.107950 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.120489 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"] Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205131 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: 
\"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205318 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.205356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: 
I0301 09:45:21.307313 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.307488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.313179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.313987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.319240 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.320659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 09:45:21.325000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:21 crc kubenswrapper[4792]: I0301 
09:45:21.480710 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.035584 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64"] Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.041478 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:45:22 crc kubenswrapper[4792]: I0301 09:45:22.074397 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerStarted","Data":"9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799"} Mar 01 09:45:23 crc kubenswrapper[4792]: I0301 09:45:23.084111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerStarted","Data":"9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682"} Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.214291 4792 scope.go:117] "RemoveContainer" containerID="5773fb565f7454c83bd5a97647f2258db8af4721a7199d6a6eb816399f3a0abe" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.246219 4792 scope.go:117] "RemoveContainer" containerID="6ad37eb5a9ce8285310a5d61f804630e8b0a3954519985d12a0d71d52e93d217" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.296714 4792 scope.go:117] "RemoveContainer" containerID="40d6bf3bff7640be17f842b1e208ca6cbd13ff723d54eb172fb51e9f37d11d71" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.355168 4792 scope.go:117] "RemoveContainer" containerID="2b11700043caea12ce3ec9b1a685865b6228db2c927ef984914abec9ff9701b8" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.420478 4792 scope.go:117] 
"RemoveContainer" containerID="a0ae22fc112a93c604faf629d5ca18987a7aa343d92829f2e65e4501f2454496" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.456655 4792 scope.go:117] "RemoveContainer" containerID="1fcee8427ea6340db8e69cb0e43a52de1fe2f18dc84e960d49fc0b0918052c29" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.573121 4792 scope.go:117] "RemoveContainer" containerID="291e30822651ff807afe1c8290d577ee02386c82661acb3758a8b13541958167" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.623140 4792 scope.go:117] "RemoveContainer" containerID="263ab29e6e451a06b962c7883f7e5448fbe4595217de2d6fcca680e67789bfae" Mar 01 09:45:30 crc kubenswrapper[4792]: I0301 09:45:30.652081 4792 scope.go:117] "RemoveContainer" containerID="db93ce9b530ac529df661a0d9b5aa2418392875dd62f9aecf46dc33ff7dfd43c" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.293931 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" podStartSLOduration=9.860264431000001 podStartE2EDuration="10.293892714s" podCreationTimestamp="2026-03-01 09:45:21 +0000 UTC" firstStartedPulling="2026-03-01 09:45:22.041134685 +0000 UTC m=+2251.283013882" lastFinishedPulling="2026-03-01 09:45:22.474762968 +0000 UTC m=+2251.716642165" observedRunningTime="2026-03-01 09:45:23.104201431 +0000 UTC m=+2252.346080638" watchObservedRunningTime="2026-03-01 09:45:31.293892714 +0000 UTC m=+2260.535771911" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.296779 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.298657 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.322034 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384763 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.384924 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486700 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486800 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.486940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.487259 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.488105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.522732 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"community-operators-6s29r\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:31 crc kubenswrapper[4792]: I0301 09:45:31.613656 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:32 crc kubenswrapper[4792]: I0301 09:45:32.202420 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164144 4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" exitCode=0 Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164257 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9"} Mar 01 09:45:33 crc kubenswrapper[4792]: I0301 09:45:33.164461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"650137c017e7367d56531d13c48673396dc39fd07a4b1f503918cddfea0fc738"} Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.174032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"} Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.943601 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:45:34 crc kubenswrapper[4792]: I0301 09:45:34.943666 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:45:35 crc kubenswrapper[4792]: I0301 09:45:35.182171 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c517000-6918-4f58-871b-7c4d26197ccf" containerID="9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682" exitCode=0 Mar 01 09:45:35 crc kubenswrapper[4792]: I0301 09:45:35.183312 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerDied","Data":"9485400e953b4441f321cb6741c7780ced074d1502b17d2cd0f2132161817682"} Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.193554 4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" exitCode=0 Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.193657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"} Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.626950 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690120 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690167 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690278 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.690631 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") pod \"6c517000-6918-4f58-871b-7c4d26197ccf\" (UID: \"6c517000-6918-4f58-871b-7c4d26197ccf\") " Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.697576 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph" (OuterVolumeSpecName: "ceph") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.697883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.698110 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q" (OuterVolumeSpecName: "kube-api-access-rqf5q") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "kube-api-access-rqf5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.717272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.722607 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory" (OuterVolumeSpecName: "inventory") pod "6c517000-6918-4f58-871b-7c4d26197ccf" (UID: "6c517000-6918-4f58-871b-7c4d26197ccf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792618 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqf5q\" (UniqueName: \"kubernetes.io/projected/6c517000-6918-4f58-871b-7c4d26197ccf-kube-api-access-rqf5q\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792888 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792899 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792921 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:36 crc kubenswrapper[4792]: I0301 09:45:36.792930 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c517000-6918-4f58-871b-7c4d26197ccf-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.243699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerStarted","Data":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"} Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.246993 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" event={"ID":"6c517000-6918-4f58-871b-7c4d26197ccf","Type":"ContainerDied","Data":"9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799"} Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.247035 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1974b1db8a8438cbc5a90dfa75c48b2a61a08d08a187b7c66e02d0f00f2799" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.247088 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.277919 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6s29r" podStartSLOduration=2.844325643 podStartE2EDuration="6.277890814s" podCreationTimestamp="2026-03-01 09:45:31 +0000 UTC" firstStartedPulling="2026-03-01 09:45:33.165812012 +0000 UTC m=+2262.407691229" lastFinishedPulling="2026-03-01 09:45:36.599377193 +0000 UTC m=+2265.841256400" observedRunningTime="2026-03-01 09:45:37.264167411 +0000 UTC m=+2266.506046608" watchObservedRunningTime="2026-03-01 09:45:37.277890814 +0000 UTC m=+2266.519770011" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325050 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"] Mar 01 09:45:37 crc kubenswrapper[4792]: E0301 09:45:37.325386 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" 
containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325404 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.325629 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c517000-6918-4f58-871b-7c4d26197ccf" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.326259 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.328876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329439 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329695 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.329973 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.330200 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.342412 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"] Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.409934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4hm\" (UniqueName: 
\"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.410739 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: 
\"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.512462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.512790 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.513976 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517424 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517658 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.517871 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.518107 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.534480 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:37 crc kubenswrapper[4792]: I0301 09:45:37.654839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:45:38 crc kubenswrapper[4792]: I0301 09:45:38.172697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb"] Mar 01 09:45:38 crc kubenswrapper[4792]: W0301 09:45:38.179335 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1201ca91_41eb_45d0_991d_71883b4014ae.slice/crio-d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8 WatchSource:0}: Error finding container d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8: Status 404 returned error can't find the container with id d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8 Mar 01 09:45:38 crc kubenswrapper[4792]: I0301 09:45:38.255875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerStarted","Data":"d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8"} Mar 01 09:45:39 crc kubenswrapper[4792]: I0301 09:45:39.268109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" 
event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerStarted","Data":"f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b"} Mar 01 09:45:39 crc kubenswrapper[4792]: I0301 09:45:39.302856 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" podStartSLOduration=1.8496646289999998 podStartE2EDuration="2.302827851s" podCreationTimestamp="2026-03-01 09:45:37 +0000 UTC" firstStartedPulling="2026-03-01 09:45:38.181811166 +0000 UTC m=+2267.423690363" lastFinishedPulling="2026-03-01 09:45:38.634974388 +0000 UTC m=+2267.876853585" observedRunningTime="2026-03-01 09:45:39.287543809 +0000 UTC m=+2268.529423036" watchObservedRunningTime="2026-03-01 09:45:39.302827851 +0000 UTC m=+2268.544707058" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.614659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.614986 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:41 crc kubenswrapper[4792]: I0301 09:45:41.657569 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:42 crc kubenswrapper[4792]: I0301 09:45:42.336226 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:42 crc kubenswrapper[4792]: I0301 09:45:42.378386 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.306744 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6s29r" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" 
containerName="registry-server" containerID="cri-o://e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" gracePeriod=2 Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.834671 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.968863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969239 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") pod \"fab28167-0dde-44ec-a712-e11f418fd4e7\" (UID: \"fab28167-0dde-44ec-a712-e11f418fd4e7\") " Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.969967 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities" (OuterVolumeSpecName: "utilities") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:45:44 crc kubenswrapper[4792]: I0301 09:45:44.976603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95" (OuterVolumeSpecName: "kube-api-access-gqj95") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "kube-api-access-gqj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.071191 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.071228 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqj95\" (UniqueName: \"kubernetes.io/projected/fab28167-0dde-44ec-a712-e11f418fd4e7-kube-api-access-gqj95\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321376 4792 generic.go:334] "Generic (PLEG): container finished" podID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" exitCode=0 Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321432 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"} Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6s29r" event={"ID":"fab28167-0dde-44ec-a712-e11f418fd4e7","Type":"ContainerDied","Data":"650137c017e7367d56531d13c48673396dc39fd07a4b1f503918cddfea0fc738"} Mar 01 09:45:45 crc kubenswrapper[4792]: 
I0301 09:45:45.321480 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6s29r" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.321494 4792 scope.go:117] "RemoveContainer" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.346002 4792 scope.go:117] "RemoveContainer" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.369125 4792 scope.go:117] "RemoveContainer" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.415698 4792 scope.go:117] "RemoveContainer" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.416360 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": container with ID starting with e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0 not found: ID does not exist" containerID="e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.416397 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0"} err="failed to get container status \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": rpc error: code = NotFound desc = could not find container \"e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0\": container with ID starting with e15a487c524290dc29e46d692cc0b283d02c1c375a139b510bca5cd9870feae0 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.416450 4792 
scope.go:117] "RemoveContainer" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.417159 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": container with ID starting with 9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906 not found: ID does not exist" containerID="9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417220 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906"} err="failed to get container status \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": rpc error: code = NotFound desc = could not find container \"9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906\": container with ID starting with 9c71fd2aaaa034791780733eaa2cd9c185e726e89dd37e1afbe56a3c6596b906 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417254 4792 scope.go:117] "RemoveContainer" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: E0301 09:45:45.417610 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": container with ID starting with 0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9 not found: ID does not exist" containerID="0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.417640 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9"} err="failed to get container status \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": rpc error: code = NotFound desc = could not find container \"0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9\": container with ID starting with 0e5e77d1eebb5d5009420264a58cc31b26cb46afeb4f4d6643cae7cd76caa0a9 not found: ID does not exist" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.489414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fab28167-0dde-44ec-a712-e11f418fd4e7" (UID: "fab28167-0dde-44ec-a712-e11f418fd4e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.580307 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab28167-0dde-44ec-a712-e11f418fd4e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.661746 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:45 crc kubenswrapper[4792]: I0301 09:45:45.673480 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6s29r"] Mar 01 09:45:47 crc kubenswrapper[4792]: I0301 09:45:47.420571 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" path="/var/lib/kubelet/pods/fab28167-0dde-44ec-a712-e11f418fd4e7/volumes" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.132295 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 
09:46:00.133226 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-utilities" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133244 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-utilities" Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 09:46:00.133270 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-content" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133279 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="extract-content" Mar 01 09:46:00 crc kubenswrapper[4792]: E0301 09:46:00.133291 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133298 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.133502 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab28167-0dde-44ec-a712-e11f418fd4e7" containerName="registry-server" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.134178 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.136066 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.139234 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.139550 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.141849 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.237958 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.339605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.358735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"auto-csr-approver-29539306-tt9nk\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " 
pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.449653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:00 crc kubenswrapper[4792]: I0301 09:46:00.880301 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:46:01 crc kubenswrapper[4792]: I0301 09:46:01.446255 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerStarted","Data":"0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092"} Mar 01 09:46:02 crc kubenswrapper[4792]: E0301 09:46:02.300030 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d29db8_7573_4364_9e18_20658b790d1f.slice/crio-conmon-739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d29db8_7573_4364_9e18_20658b790d1f.slice/crio-739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e.scope\": RecentStats: unable to find data in memory cache]" Mar 01 09:46:02 crc kubenswrapper[4792]: I0301 09:46:02.455957 4792 generic.go:334] "Generic (PLEG): container finished" podID="f5d29db8-7573-4364-9e18-20658b790d1f" containerID="739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e" exitCode=0 Mar 01 09:46:02 crc kubenswrapper[4792]: I0301 09:46:02.456003 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerDied","Data":"739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e"} Mar 01 
09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.769253 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.904018 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") pod \"f5d29db8-7573-4364-9e18-20658b790d1f\" (UID: \"f5d29db8-7573-4364-9e18-20658b790d1f\") " Mar 01 09:46:03 crc kubenswrapper[4792]: I0301 09:46:03.909194 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld" (OuterVolumeSpecName: "kube-api-access-462ld") pod "f5d29db8-7573-4364-9e18-20658b790d1f" (UID: "f5d29db8-7573-4364-9e18-20658b790d1f"). InnerVolumeSpecName "kube-api-access-462ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.006188 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-462ld\" (UniqueName: \"kubernetes.io/projected/f5d29db8-7573-4364-9e18-20658b790d1f-kube-api-access-462ld\") on node \"crc\" DevicePath \"\"" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" event={"ID":"f5d29db8-7573-4364-9e18-20658b790d1f","Type":"ContainerDied","Data":"0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092"} Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471610 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bedb45f125679d6b8c0bd25245b8ca7d427fb9cc496e01b3f86aa772d139092" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.471631 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539306-tt9nk" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.843026 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.851284 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539300-ws5xb"] Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942456 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942498 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.942532 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.943175 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:46:04 crc kubenswrapper[4792]: I0301 09:46:04.943217 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7" gracePeriod=600 Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.432228 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79584910-9524-4e1c-8edf-5411aa71eb0a" path="/var/lib/kubelet/pods/79584910-9524-4e1c-8edf-5411aa71eb0a/volumes" Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484257 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7" exitCode=0 Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7"} Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484413 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"} Mar 01 09:46:05 crc kubenswrapper[4792]: I0301 09:46:05.484478 4792 scope.go:117] "RemoveContainer" containerID="60fb953fe0649dcf15e296ccf0840ddc5d89a6b945dcfc20bd3c7a7de54e603e" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.883524 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:27 crc kubenswrapper[4792]: E0301 09:46:27.884275 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc" Mar 01 09:46:27 crc 
kubenswrapper[4792]: I0301 09:46:27.884287 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.884457 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" containerName="oc" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.885596 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.894090 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.955989 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.956333 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:27 crc kubenswrapper[4792]: I0301 09:46:27.956407 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: 
I0301 09:46:28.058017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.058621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.084516 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"redhat-marketplace-6n47w\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.205349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:28 crc kubenswrapper[4792]: I0301 09:46:28.719131 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686490 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16" exitCode=0 Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686744 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"} Mar 01 09:46:29 crc kubenswrapper[4792]: I0301 09:46:29.686794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"aeb132f4e48cdf93635bdad095b234bbcae82fcff8988e2298e8a24863eaf81c"} Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.695027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"} Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.811070 4792 scope.go:117] 
"RemoveContainer" containerID="c7d695ec4b923c131de05f4b3929cfdf1d810db81219d72b727b76b052713eaf" Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.848204 4792 scope.go:117] "RemoveContainer" containerID="4100c168a5febe541b2b6fdd770ebafed4e19b504bd5f0104ce3f530de9d8c6d" Mar 01 09:46:30 crc kubenswrapper[4792]: I0301 09:46:30.905180 4792 scope.go:117] "RemoveContainer" containerID="8579b82c3f2aeab429db244bb7b4d62bd406e57babe3af839bf5e91664e2433c" Mar 01 09:46:31 crc kubenswrapper[4792]: I0301 09:46:31.703460 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07" exitCode=0 Mar 01 09:46:31 crc kubenswrapper[4792]: I0301 09:46:31.703499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"} Mar 01 09:46:32 crc kubenswrapper[4792]: I0301 09:46:32.716038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerStarted","Data":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"} Mar 01 09:46:32 crc kubenswrapper[4792]: I0301 09:46:32.741495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6n47w" podStartSLOduration=3.262183815 podStartE2EDuration="5.741461864s" podCreationTimestamp="2026-03-01 09:46:27 +0000 UTC" firstStartedPulling="2026-03-01 09:46:29.690345433 +0000 UTC m=+2318.932224630" lastFinishedPulling="2026-03-01 09:46:32.169623472 +0000 UTC m=+2321.411502679" observedRunningTime="2026-03-01 09:46:32.739493665 +0000 UTC m=+2321.981372912" watchObservedRunningTime="2026-03-01 09:46:32.741461864 +0000 UTC m=+2321.983341071" Mar 01 
09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.205806 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.207515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.252557 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.805387 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:38 crc kubenswrapper[4792]: I0301 09:46:38.857764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:40 crc kubenswrapper[4792]: I0301 09:46:40.770493 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6n47w" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server" containerID="cri-o://777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" gracePeriod=2 Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.286342 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.412591 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") pod \"54fe8edc-deda-4a44-b14f-263f77d4c545\" (UID: \"54fe8edc-deda-4a44-b14f-263f77d4c545\") " Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.416223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities" (OuterVolumeSpecName: "utilities") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.420303 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px" (OuterVolumeSpecName: "kube-api-access-dh5px") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "kube-api-access-dh5px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.440998 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54fe8edc-deda-4a44-b14f-263f77d4c545" (UID: "54fe8edc-deda-4a44-b14f-263f77d4c545"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514894 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514938 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54fe8edc-deda-4a44-b14f-263f77d4c545-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.514954 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh5px\" (UniqueName: \"kubernetes.io/projected/54fe8edc-deda-4a44-b14f-263f77d4c545-kube-api-access-dh5px\") on node \"crc\" DevicePath \"\"" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.787773 4792 generic.go:334] "Generic (PLEG): container finished" podID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" exitCode=0 Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"} Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788148 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6n47w" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6n47w" event={"ID":"54fe8edc-deda-4a44-b14f-263f77d4c545","Type":"ContainerDied","Data":"aeb132f4e48cdf93635bdad095b234bbcae82fcff8988e2298e8a24863eaf81c"} Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.788182 4792 scope.go:117] "RemoveContainer" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.816369 4792 scope.go:117] "RemoveContainer" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.818746 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.825891 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6n47w"] Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.843932 4792 scope.go:117] "RemoveContainer" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867257 4792 scope.go:117] "RemoveContainer" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 09:46:41.867720 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": container with ID starting with 777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd not found: ID does not exist" containerID="777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867749 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd"} err="failed to get container status \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": rpc error: code = NotFound desc = could not find container \"777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd\": container with ID starting with 777633989bccc7df92624b56ebe9fc99a3b70de7c01e07c533c777807881f6dd not found: ID does not exist" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.867770 4792 scope.go:117] "RemoveContainer" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07" Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 09:46:41.868178 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": container with ID starting with c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07 not found: ID does not exist" containerID="c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868207 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07"} err="failed to get container status \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": rpc error: code = NotFound desc = could not find container \"c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07\": container with ID starting with c398ffc0f4f3830fdd01b5402a7b585b2e9ced0443568ac24dd95327806a3b07 not found: ID does not exist" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868219 4792 scope.go:117] "RemoveContainer" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16" Mar 01 09:46:41 crc kubenswrapper[4792]: E0301 
09:46:41.868392 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": container with ID starting with 167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16 not found: ID does not exist" containerID="167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16" Mar 01 09:46:41 crc kubenswrapper[4792]: I0301 09:46:41.868419 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16"} err="failed to get container status \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": rpc error: code = NotFound desc = could not find container \"167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16\": container with ID starting with 167b6d9651d45aa34b171933ad6a0854589140bcc6856976316b5cce113f3f16 not found: ID does not exist" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.421518 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" path="/var/lib/kubelet/pods/54fe8edc-deda-4a44-b14f-263f77d4c545/volumes" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.892952 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893331 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893345 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server" Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893372 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-content" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893378 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-content" Mar 01 09:46:43 crc kubenswrapper[4792]: E0301 09:46:43.893386 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-utilities" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893392 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="extract-utilities" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.893546 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="54fe8edc-deda-4a44-b14f-263f77d4c545" containerName="registry-server" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.894700 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.913164 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.960969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.961027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: 
\"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:43 crc kubenswrapper[4792]: I0301 09:46:43.961409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.063527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.064106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"certified-operators-ngjhc\" (UID: 
\"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.064266 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.080978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"certified-operators-ngjhc\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.218215 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:44 crc kubenswrapper[4792]: I0301 09:46:44.817396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.813884 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" exitCode=0 Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.814048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3"} Mar 01 09:46:45 crc kubenswrapper[4792]: I0301 09:46:45.814420 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"69f0c5dc0c1619083b7daceb6ad82ab03f1d67532fb7c06ccbeefe11fdc99439"} Mar 01 09:46:46 crc kubenswrapper[4792]: I0301 09:46:46.823335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"} Mar 01 09:46:50 crc kubenswrapper[4792]: I0301 09:46:50.854602 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" exitCode=0 Mar 01 09:46:50 crc kubenswrapper[4792]: I0301 09:46:50.854794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" 
event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"} Mar 01 09:46:52 crc kubenswrapper[4792]: I0301 09:46:52.872605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerStarted","Data":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"} Mar 01 09:46:52 crc kubenswrapper[4792]: I0301 09:46:52.902258 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngjhc" podStartSLOduration=3.439604086 podStartE2EDuration="9.902238425s" podCreationTimestamp="2026-03-01 09:46:43 +0000 UTC" firstStartedPulling="2026-03-01 09:46:45.815958058 +0000 UTC m=+2335.057837265" lastFinishedPulling="2026-03-01 09:46:52.278592407 +0000 UTC m=+2341.520471604" observedRunningTime="2026-03-01 09:46:52.899580779 +0000 UTC m=+2342.141459976" watchObservedRunningTime="2026-03-01 09:46:52.902238425 +0000 UTC m=+2342.144117622" Mar 01 09:46:54 crc kubenswrapper[4792]: I0301 09:46:54.218841 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:54 crc kubenswrapper[4792]: I0301 09:46:54.219213 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:46:55 crc kubenswrapper[4792]: I0301 09:46:55.265517 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ngjhc" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" probeResult="failure" output=< Mar 01 09:46:55 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 09:46:55 crc kubenswrapper[4792]: > Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.261778 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.314063 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:47:04 crc kubenswrapper[4792]: I0301 09:47:04.496292 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:47:05 crc kubenswrapper[4792]: I0301 09:47:05.959938 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngjhc" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" containerID="cri-o://ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" gracePeriod=2 Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.386334 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.559958 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") pod \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\" (UID: \"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3\") " Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.562849 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities" (OuterVolumeSpecName: "utilities") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.569604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d" (OuterVolumeSpecName: "kube-api-access-8kp4d") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "kube-api-access-8kp4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.630570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" (UID: "d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662104 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kp4d\" (UniqueName: \"kubernetes.io/projected/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-kube-api-access-8kp4d\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662130 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.662140 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969581 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" exitCode=0 Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"} Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969693 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngjhc" event={"ID":"d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3","Type":"ContainerDied","Data":"69f0c5dc0c1619083b7daceb6ad82ab03f1d67532fb7c06ccbeefe11fdc99439"} Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 09:47:06.969714 4792 scope.go:117] "RemoveContainer" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" Mar 01 09:47:06 crc kubenswrapper[4792]: I0301 
09:47:06.969865 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngjhc" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:06.999228 4792 scope.go:117] "RemoveContainer" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.005284 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.011083 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngjhc"] Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.020355 4792 scope.go:117] "RemoveContainer" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061125 4792 scope.go:117] "RemoveContainer" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.061702 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": container with ID starting with ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e not found: ID does not exist" containerID="ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061801 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e"} err="failed to get container status \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": rpc error: code = NotFound desc = could not find container \"ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e\": container with ID starting with 
ac24232ff9dc9a3289df755cfb01ddcde4ad46d0fb7b6c271ed181ad1e56d80e not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.061925 4792 scope.go:117] "RemoveContainer" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.062223 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": container with ID starting with 6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303 not found: ID does not exist" containerID="6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.062329 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303"} err="failed to get container status \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": rpc error: code = NotFound desc = could not find container \"6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303\": container with ID starting with 6f64b6331ce23adb7d1f2935446438b352d2e4ffc202dfc8aceabf6b55312303 not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.062417 4792 scope.go:117] "RemoveContainer" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc kubenswrapper[4792]: E0301 09:47:07.062787 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": container with ID starting with 4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3 not found: ID does not exist" containerID="4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3" Mar 01 09:47:07 crc 
kubenswrapper[4792]: I0301 09:47:07.062882 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3"} err="failed to get container status \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": rpc error: code = NotFound desc = could not find container \"4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3\": container with ID starting with 4d936e88bfe740547ec3ad69b38a532ec388cf67e43e3209ea871881ebe7b0b3 not found: ID does not exist" Mar 01 09:47:07 crc kubenswrapper[4792]: I0301 09:47:07.419684 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" path="/var/lib/kubelet/pods/d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3/volumes" Mar 01 09:47:20 crc kubenswrapper[4792]: I0301 09:47:20.079766 4792 generic.go:334] "Generic (PLEG): container finished" podID="1201ca91-41eb-45d0-991d-71883b4014ae" containerID="f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b" exitCode=0 Mar 01 09:47:20 crc kubenswrapper[4792]: I0301 09:47:20.079861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerDied","Data":"f9bde77e46db7a7a43711f203d62d437c0037f5981ab90809f1911e681e22c3b"} Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.518127 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646546 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.646662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") pod \"1201ca91-41eb-45d0-991d-71883b4014ae\" (UID: \"1201ca91-41eb-45d0-991d-71883b4014ae\") " Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.651272 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.651430 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph" (OuterVolumeSpecName: "ceph") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.652923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm" (OuterVolumeSpecName: "kube-api-access-vx4hm") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "kube-api-access-vx4hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.675090 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.675525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory" (OuterVolumeSpecName: "inventory") pod "1201ca91-41eb-45d0-991d-71883b4014ae" (UID: "1201ca91-41eb-45d0-991d-71883b4014ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748585 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748614 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx4hm\" (UniqueName: \"kubernetes.io/projected/1201ca91-41eb-45d0-991d-71883b4014ae-kube-api-access-vx4hm\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748624 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748632 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:21 crc kubenswrapper[4792]: I0301 09:47:21.748641 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1201ca91-41eb-45d0-991d-71883b4014ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097459 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" event={"ID":"1201ca91-41eb-45d0-991d-71883b4014ae","Type":"ContainerDied","Data":"d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8"} Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097506 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a561fc80333203759a7141294eee19a809aea7b68c78b4e20c79a31fc48cc8" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.097557 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.187622 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.187968 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.187984 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.187994 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-utilities" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-utilities" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.188023 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-content" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188030 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="extract-content" Mar 01 09:47:22 crc kubenswrapper[4792]: E0301 09:47:22.188046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188052 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188209 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f3ecdd-62b1-4de5-9c1b-1ecad41aadf3" containerName="registry-server" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188223 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1201ca91-41eb-45d0-991d-71883b4014ae" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.188834 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.193948 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194195 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194346 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.194673 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.216468 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358473 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: 
\"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358561 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.358613 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460513 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.460655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.474590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.474630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.475345 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.515557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dtgks\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:22 crc kubenswrapper[4792]: I0301 09:47:22.810408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:23 crc kubenswrapper[4792]: I0301 09:47:23.305842 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks"] Mar 01 09:47:23 crc kubenswrapper[4792]: W0301 09:47:23.311213 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25228f4_912f_408c_a1d6_9279c350b767.slice/crio-f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7 WatchSource:0}: Error finding container f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7: Status 404 returned error can't find the container with id f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7 Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.118800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" 
event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerStarted","Data":"19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9"} Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.119171 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerStarted","Data":"f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7"} Mar 01 09:47:24 crc kubenswrapper[4792]: I0301 09:47:24.133250 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" podStartSLOduration=1.580982858 podStartE2EDuration="2.133223s" podCreationTimestamp="2026-03-01 09:47:22 +0000 UTC" firstStartedPulling="2026-03-01 09:47:23.314029967 +0000 UTC m=+2372.555909164" lastFinishedPulling="2026-03-01 09:47:23.866270109 +0000 UTC m=+2373.108149306" observedRunningTime="2026-03-01 09:47:24.132128542 +0000 UTC m=+2373.374007749" watchObservedRunningTime="2026-03-01 09:47:24.133223 +0000 UTC m=+2373.375102197" Mar 01 09:47:50 crc kubenswrapper[4792]: I0301 09:47:50.325058 4792 generic.go:334] "Generic (PLEG): container finished" podID="f25228f4-912f-408c-a1d6-9279c350b767" containerID="19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9" exitCode=0 Mar 01 09:47:50 crc kubenswrapper[4792]: I0301 09:47:50.325194 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerDied","Data":"19c6ec5339d76a1fab1bfb2d7d9edc634f8fcdd436f9a2e8873cd35a55eb96f9"} Mar 01 09:47:51 crc kubenswrapper[4792]: I0301 09:47:51.966036 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.064549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") pod \"f25228f4-912f-408c-a1d6-9279c350b767\" (UID: \"f25228f4-912f-408c-a1d6-9279c350b767\") " Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.070771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8" (OuterVolumeSpecName: "kube-api-access-9lck8") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "kube-api-access-9lck8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.073059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph" (OuterVolumeSpecName: "ceph") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.091096 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.092296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory" (OuterVolumeSpecName: "inventory") pod "f25228f4-912f-408c-a1d6-9279c350b767" (UID: "f25228f4-912f-408c-a1d6-9279c350b767"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166777 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166820 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lck8\" (UniqueName: \"kubernetes.io/projected/f25228f4-912f-408c-a1d6-9279c350b767-kube-api-access-9lck8\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166832 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.166842 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f25228f4-912f-408c-a1d6-9279c350b767-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340834 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" event={"ID":"f25228f4-912f-408c-a1d6-9279c350b767","Type":"ContainerDied","Data":"f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7"} Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340871 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bebe2a2356f7743d8d689e986f92baffce6ac133e06edc7d2543f477b746c7" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.340950 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dtgks" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431131 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"] Mar 01 09:47:52 crc kubenswrapper[4792]: E0301 09:47:52.431471 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431483 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.431651 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25228f4-912f-408c-a1d6-9279c350b767" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.432303 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.434772 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435089 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.435876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.438731 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.451249 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"] Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578796 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.578870 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 
01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.680250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.690494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.691541 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.699774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.705111 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-phn2l\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:52 crc kubenswrapper[4792]: I0301 09:47:52.744898 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:47:53 crc kubenswrapper[4792]: I0301 09:47:53.307515 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l"] Mar 01 09:47:53 crc kubenswrapper[4792]: I0301 09:47:53.348191 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerStarted","Data":"0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc"} Mar 01 09:47:54 crc kubenswrapper[4792]: I0301 09:47:54.357598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerStarted","Data":"29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9"} Mar 01 09:47:54 crc kubenswrapper[4792]: I0301 09:47:54.382213 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" podStartSLOduration=1.877013566 podStartE2EDuration="2.382192169s" podCreationTimestamp="2026-03-01 09:47:52 +0000 UTC" firstStartedPulling="2026-03-01 09:47:53.32378724 +0000 UTC m=+2402.565666437" lastFinishedPulling="2026-03-01 09:47:53.828965843 +0000 UTC m=+2403.070845040" observedRunningTime="2026-03-01 09:47:54.380052015 +0000 UTC m=+2403.621931222" watchObservedRunningTime="2026-03-01 09:47:54.382192169 +0000 UTC m=+2403.624071366" Mar 01 09:47:59 crc kubenswrapper[4792]: I0301 09:47:59.399068 4792 generic.go:334] "Generic (PLEG): container finished" podID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerID="29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9" exitCode=0 Mar 01 09:47:59 crc kubenswrapper[4792]: I0301 09:47:59.399260 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerDied","Data":"29293b309e902e27c992cdd4a10eda8522305bb670dcf4444d1ee4e4d67716f9"} Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.125751 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.126931 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.128982 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.131247 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.131428 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.141786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.248393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.350163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.368773 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"auto-csr-approver-29539308-lnc4n\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " 
pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.453437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:00 crc kubenswrapper[4792]: I0301 09:48:00.973598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.009775 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.062866 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063402 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") pod \"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.063520 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") pod 
\"59b987d8-9463-48cb-9651-1e5cb16aa764\" (UID: \"59b987d8-9463-48cb-9651-1e5cb16aa764\") " Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.068489 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph" (OuterVolumeSpecName: "ceph") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.068801 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449" (OuterVolumeSpecName: "kube-api-access-p8449") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "kube-api-access-p8449". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.092378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.092680 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory" (OuterVolumeSpecName: "inventory") pod "59b987d8-9463-48cb-9651-1e5cb16aa764" (UID: "59b987d8-9463-48cb-9651-1e5cb16aa764"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164932 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164967 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8449\" (UniqueName: \"kubernetes.io/projected/59b987d8-9463-48cb-9651-1e5cb16aa764-kube-api-access-p8449\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164979 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.164987 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59b987d8-9463-48cb-9651-1e5cb16aa764-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.422763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerStarted","Data":"824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b"} Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" event={"ID":"59b987d8-9463-48cb-9651-1e5cb16aa764","Type":"ContainerDied","Data":"0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc"} Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424201 4792 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0965a313b710b708c5af294657c525d6ca94115e8707a0d5c46e3b088cb75fcc" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.424250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-phn2l" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496479 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"] Mar 01 09:48:01 crc kubenswrapper[4792]: E0301 09:48:01.496791 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496804 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.496990 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b987d8-9463-48cb-9651-1e5cb16aa764" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.497579 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.500737 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.500916 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.501490 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.502033 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.505043 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.541245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"] Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: 
\"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.674674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776679 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776805 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.776960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.781651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.781685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.782461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.805294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4rw28\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:01 crc kubenswrapper[4792]: I0301 09:48:01.812146 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:02 crc kubenswrapper[4792]: I0301 09:48:02.361860 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28"] Mar 01 09:48:02 crc kubenswrapper[4792]: W0301 09:48:02.372109 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822af429_9091_43e5_a16d_7a287f2c5bb2.slice/crio-81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96 WatchSource:0}: Error finding container 81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96: Status 404 returned error can't find the container with id 81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96 Mar 01 09:48:02 crc kubenswrapper[4792]: I0301 09:48:02.431957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerStarted","Data":"81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96"} Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.450047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" 
event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerStarted","Data":"97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746"} Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.452189 4792 generic.go:334] "Generic (PLEG): container finished" podID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerID="cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9" exitCode=0 Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.452239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerDied","Data":"cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9"} Mar 01 09:48:04 crc kubenswrapper[4792]: I0301 09:48:04.465372 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" podStartSLOduration=2.284158907 podStartE2EDuration="3.465355819s" podCreationTimestamp="2026-03-01 09:48:01 +0000 UTC" firstStartedPulling="2026-03-01 09:48:02.373952547 +0000 UTC m=+2411.615831744" lastFinishedPulling="2026-03-01 09:48:03.555149459 +0000 UTC m=+2412.797028656" observedRunningTime="2026-03-01 09:48:04.463529803 +0000 UTC m=+2413.705409000" watchObservedRunningTime="2026-03-01 09:48:04.465355819 +0000 UTC m=+2413.707235016" Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.787223 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.859355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") pod \"1e3f198d-a642-45b3-9a5a-fd5906670db8\" (UID: \"1e3f198d-a642-45b3-9a5a-fd5906670db8\") " Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.867174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7" (OuterVolumeSpecName: "kube-api-access-jt7t7") pod "1e3f198d-a642-45b3-9a5a-fd5906670db8" (UID: "1e3f198d-a642-45b3-9a5a-fd5906670db8"). InnerVolumeSpecName "kube-api-access-jt7t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:48:05 crc kubenswrapper[4792]: I0301 09:48:05.962094 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7t7\" (UniqueName: \"kubernetes.io/projected/1e3f198d-a642-45b3-9a5a-fd5906670db8-kube-api-access-jt7t7\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" event={"ID":"1e3f198d-a642-45b3-9a5a-fd5906670db8","Type":"ContainerDied","Data":"824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b"} Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468500 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="824c90eca7cbf0dcb5df0b863beabbb76571e3954f409dbed59aaeac345a811b" Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.468527 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539308-lnc4n" Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.855023 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:48:06 crc kubenswrapper[4792]: I0301 09:48:06.860933 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539302-26gpt"] Mar 01 09:48:07 crc kubenswrapper[4792]: I0301 09:48:07.423016 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5" path="/var/lib/kubelet/pods/792c2a11-8c71-4c9a-b0e8-53ed5d9e77f5/volumes" Mar 01 09:48:31 crc kubenswrapper[4792]: I0301 09:48:31.120295 4792 scope.go:117] "RemoveContainer" containerID="53fe1a8f0f86c9e965b90816a9566427d372fba1cc22db1d2bb0ca2e72f57708" Mar 01 09:48:34 crc kubenswrapper[4792]: I0301 09:48:34.943144 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:48:34 crc kubenswrapper[4792]: I0301 09:48:34.943653 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:48:41 crc kubenswrapper[4792]: I0301 09:48:41.779789 4792 generic.go:334] "Generic (PLEG): container finished" podID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerID="97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746" exitCode=0 Mar 01 09:48:41 crc kubenswrapper[4792]: I0301 09:48:41.779869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerDied","Data":"97d084e532d6eb8f554aee35118c23f6e5264bb4a9857125f5d1e0995c3a8746"} Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.178700 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249540 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249583 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.249608 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.263605 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n" (OuterVolumeSpecName: "kube-api-access-m8g2n") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "kube-api-access-m8g2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.271867 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph" (OuterVolumeSpecName: "ceph") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.296669 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory podName:822af429-9091-43e5-a16d-7a287f2c5bb2 nodeName:}" failed. No retries permitted until 2026-03-01 09:48:43.796641405 +0000 UTC m=+2453.038520602 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2") : error deleting /var/lib/kubelet/pods/822af429-9091-43e5-a16d-7a287f2c5bb2/volume-subpaths: remove /var/lib/kubelet/pods/822af429-9091-43e5-a16d-7a287f2c5bb2/volume-subpaths: no such file or directory Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.299874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.351962 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.351994 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8g2n\" (UniqueName: \"kubernetes.io/projected/822af429-9091-43e5-a16d-7a287f2c5bb2-kube-api-access-m8g2n\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.352003 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" event={"ID":"822af429-9091-43e5-a16d-7a287f2c5bb2","Type":"ContainerDied","Data":"81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96"} Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798305 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b2f396188a63440fcba678466e372145f1b2a7a97016d2c98f9e382bb3bf96" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.798721 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4rw28" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.861739 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") pod \"822af429-9091-43e5-a16d-7a287f2c5bb2\" (UID: \"822af429-9091-43e5-a16d-7a287f2c5bb2\") " Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.868106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory" (OuterVolumeSpecName: "inventory") pod "822af429-9091-43e5-a16d-7a287f2c5bb2" (UID: "822af429-9091-43e5-a16d-7a287f2c5bb2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906031 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.906398 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906419 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: E0301 09:48:43.906442 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906449 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906610 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" containerName="oc" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.906625 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="822af429-9091-43e5-a16d-7a287f2c5bb2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.907184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.964334 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:43 crc kubenswrapper[4792]: I0301 09:48:43.966177 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822af429-9091-43e5-a16d-7a287f2c5bb2-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068183 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068649 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.068727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170198 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.170249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.173224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.173640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.174556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.191091 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.243419 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.790819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m"] Mar 01 09:48:44 crc kubenswrapper[4792]: I0301 09:48:44.811195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerStarted","Data":"dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c"} Mar 01 09:48:45 crc kubenswrapper[4792]: I0301 09:48:45.821338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerStarted","Data":"85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc"} Mar 01 09:48:45 crc kubenswrapper[4792]: I0301 09:48:45.836107 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" podStartSLOduration=2.212627227 podStartE2EDuration="2.836089179s" podCreationTimestamp="2026-03-01 09:48:43 +0000 UTC" firstStartedPulling="2026-03-01 09:48:44.801619651 +0000 UTC m=+2454.043498858" lastFinishedPulling="2026-03-01 09:48:45.425081593 +0000 UTC m=+2454.666960810" observedRunningTime="2026-03-01 09:48:45.835693109 
+0000 UTC m=+2455.077572306" watchObservedRunningTime="2026-03-01 09:48:45.836089179 +0000 UTC m=+2455.077968376" Mar 01 09:48:49 crc kubenswrapper[4792]: I0301 09:48:49.853798 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerID="85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc" exitCode=0 Mar 01 09:48:49 crc kubenswrapper[4792]: I0301 09:48:49.853883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerDied","Data":"85c9f21baa52c98be995a258e8b085fb46979f0ec9568bd0eb472bbf230fefcc"} Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.265137 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320230 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320299 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " 
Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.320405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") pod \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\" (UID: \"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37\") " Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.330545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs" (OuterVolumeSpecName: "kube-api-access-mktvs") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "kube-api-access-mktvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.330643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph" (OuterVolumeSpecName: "ceph") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.353562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.355750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory" (OuterVolumeSpecName: "inventory") pod "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" (UID: "2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422487 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mktvs\" (UniqueName: \"kubernetes.io/projected/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-kube-api-access-mktvs\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422846 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422861 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.422878 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" event={"ID":"2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37","Type":"ContainerDied","Data":"dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c"} Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872100 4792 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="dc03f6c62bcf64863e46028ea3bb7dc4615b60cbd65d489e3c6a07cbcfb6540c" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.872166 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.937487 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"] Mar 01 09:48:51 crc kubenswrapper[4792]: E0301 09:48:51.937922 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.937942 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.938179 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.938888 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.941884 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942076 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942173 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.942227 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.943881 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:48:51 crc kubenswrapper[4792]: I0301 09:48:51.951723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"] Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038359 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: 
\"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.038577 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140401 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140549 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.140577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.145298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.146102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.149809 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: 
\"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.165287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.258219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.764444 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5"] Mar 01 09:48:52 crc kubenswrapper[4792]: I0301 09:48:52.882981 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerStarted","Data":"5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022"} Mar 01 09:48:53 crc kubenswrapper[4792]: I0301 09:48:53.906795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerStarted","Data":"3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7"} Mar 01 09:48:53 crc kubenswrapper[4792]: I0301 09:48:53.931425 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" podStartSLOduration=2.403768947 podStartE2EDuration="2.931409208s" podCreationTimestamp="2026-03-01 09:48:51 +0000 UTC" 
firstStartedPulling="2026-03-01 09:48:52.767132343 +0000 UTC m=+2462.009011540" lastFinishedPulling="2026-03-01 09:48:53.294772604 +0000 UTC m=+2462.536651801" observedRunningTime="2026-03-01 09:48:53.922893767 +0000 UTC m=+2463.164772964" watchObservedRunningTime="2026-03-01 09:48:53.931409208 +0000 UTC m=+2463.173288405" Mar 01 09:49:04 crc kubenswrapper[4792]: I0301 09:49:04.943648 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:49:04 crc kubenswrapper[4792]: I0301 09:49:04.944196 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.942657 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.944259 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.944408 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.945143 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:49:34 crc kubenswrapper[4792]: I0301 09:49:34.945263 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" gracePeriod=600 Mar 01 09:49:35 crc kubenswrapper[4792]: E0301 09:49:35.082551 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.235766 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" exitCode=0 Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.235808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983"} Mar 01 09:49:35 crc 
kubenswrapper[4792]: I0301 09:49:35.235839 4792 scope.go:117] "RemoveContainer" containerID="80f7f8ed75933ce29e4ba1caec37158448f83396fe1f5a8bba0233aec8df1ec7" Mar 01 09:49:35 crc kubenswrapper[4792]: I0301 09:49:35.236437 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:49:35 crc kubenswrapper[4792]: E0301 09:49:35.236662 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:49:39 crc kubenswrapper[4792]: I0301 09:49:39.269442 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerID="3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7" exitCode=0 Mar 01 09:49:39 crc kubenswrapper[4792]: I0301 09:49:39.269519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerDied","Data":"3b3b94d508283fee55d1620f5691acc50b6a3016f8452ef7d9860b67978a91f7"} Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.615000 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774813 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.774845 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") pod \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\" (UID: \"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0\") " Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.780325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4" (OuterVolumeSpecName: "kube-api-access-pdcf4") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "kube-api-access-pdcf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.785075 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph" (OuterVolumeSpecName: "ceph") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.804439 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory" (OuterVolumeSpecName: "inventory") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.812185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" (UID: "cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877012 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcf4\" (UniqueName: \"kubernetes.io/projected/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-kube-api-access-pdcf4\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877047 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877056 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:40 crc kubenswrapper[4792]: I0301 09:49:40.877065 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286490 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" event={"ID":"cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0","Type":"ContainerDied","Data":"5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022"} Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286748 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5991a571e746a511019bcebff55e511b0b510410f4c7048f3a390aaf57e77022" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.286555 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391209 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"] Mar 01 09:49:41 crc kubenswrapper[4792]: E0301 09:49:41.391694 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391716 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.391922 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.392660 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.397481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.397721 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398159 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.398663 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.405446 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"] Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.589621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.589740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 
crc kubenswrapper[4792]: I0301 09:49:41.589887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.590002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691152 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691213 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.691249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.695239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.697106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.700225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:41 crc kubenswrapper[4792]: I0301 09:49:41.711662 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ssh-known-hosts-edpm-deployment-8k5rj\" (UID: 
\"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:42 crc kubenswrapper[4792]: I0301 09:49:42.007255 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:42 crc kubenswrapper[4792]: W0301 09:49:42.530638 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac58ff00_ba74_492a_97f1_e72c56686f1d.slice/crio-e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88 WatchSource:0}: Error finding container e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88: Status 404 returned error can't find the container with id e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88 Mar 01 09:49:42 crc kubenswrapper[4792]: I0301 09:49:42.539285 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8k5rj"] Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.301964 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerStarted","Data":"baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5"} Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.302404 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerStarted","Data":"e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88"} Mar 01 09:49:43 crc kubenswrapper[4792]: I0301 09:49:43.327624 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" podStartSLOduration=1.886662719 podStartE2EDuration="2.327608231s" podCreationTimestamp="2026-03-01 09:49:41 +0000 UTC" firstStartedPulling="2026-03-01 
09:49:42.532697844 +0000 UTC m=+2511.774577041" lastFinishedPulling="2026-03-01 09:49:42.973643356 +0000 UTC m=+2512.215522553" observedRunningTime="2026-03-01 09:49:43.323960181 +0000 UTC m=+2512.565839388" watchObservedRunningTime="2026-03-01 09:49:43.327608231 +0000 UTC m=+2512.569487428" Mar 01 09:49:48 crc kubenswrapper[4792]: I0301 09:49:48.409088 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:49:48 crc kubenswrapper[4792]: E0301 09:49:48.410173 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:49:52 crc kubenswrapper[4792]: I0301 09:49:52.385701 4792 generic.go:334] "Generic (PLEG): container finished" podID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerID="baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5" exitCode=0 Mar 01 09:49:52 crc kubenswrapper[4792]: I0301 09:49:52.385799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerDied","Data":"baae86ba8afecd3bca5e5d9015365aee4c985262e904999ecd39f2c9a6bda3b5"} Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.763631 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.820799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") pod \"ac58ff00-ba74-492a-97f1-e72c56686f1d\" (UID: \"ac58ff00-ba74-492a-97f1-e72c56686f1d\") " Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.828138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8" (OuterVolumeSpecName: "kube-api-access-99vt8") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "kube-api-access-99vt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.829086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph" (OuterVolumeSpecName: "ceph") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.844974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.846389 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac58ff00-ba74-492a-97f1-e72c56686f1d" (UID: "ac58ff00-ba74-492a-97f1-e72c56686f1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922871 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922954 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922968 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac58ff00-ba74-492a-97f1-e72c56686f1d-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:53 crc kubenswrapper[4792]: I0301 09:49:53.922978 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99vt8\" (UniqueName: \"kubernetes.io/projected/ac58ff00-ba74-492a-97f1-e72c56686f1d-kube-api-access-99vt8\") on node \"crc\" DevicePath \"\"" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" event={"ID":"ac58ff00-ba74-492a-97f1-e72c56686f1d","Type":"ContainerDied","Data":"e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88"} Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414760 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e76f2a47a32ae325d9f18bf7b6fb8564c6dbb5ff6d9d975c9bbd12ba32035f88" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.414884 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8k5rj" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493266 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"] Mar 01 09:49:54 crc kubenswrapper[4792]: E0301 09:49:54.493681 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493703 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.493883 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac58ff00-ba74-492a-97f1-e72c56686f1d" containerName="ssh-known-hosts-edpm-deployment" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.494601 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.499563 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.499779 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.500777 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.507864 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.508477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"] Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552066 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552746 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.552821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654683 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.654746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.661403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.667440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.670811 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc 
kubenswrapper[4792]: I0301 09:49:54.671403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gxxr7\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:54 crc kubenswrapper[4792]: I0301 09:49:54.871179 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:49:55 crc kubenswrapper[4792]: I0301 09:49:55.364226 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7"] Mar 01 09:49:55 crc kubenswrapper[4792]: I0301 09:49:55.421593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerStarted","Data":"db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19"} Mar 01 09:49:56 crc kubenswrapper[4792]: I0301 09:49:56.430207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerStarted","Data":"b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb"} Mar 01 09:49:56 crc kubenswrapper[4792]: I0301 09:49:56.447469 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" podStartSLOduration=2.004931616 podStartE2EDuration="2.447448117s" podCreationTimestamp="2026-03-01 09:49:54 +0000 UTC" firstStartedPulling="2026-03-01 09:49:55.371842471 +0000 UTC m=+2524.613721668" lastFinishedPulling="2026-03-01 09:49:55.814358962 +0000 UTC m=+2525.056238169" observedRunningTime="2026-03-01 
09:49:56.442857593 +0000 UTC m=+2525.684736810" watchObservedRunningTime="2026-03-01 09:49:56.447448117 +0000 UTC m=+2525.689327314" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.128741 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.130348 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.132633 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.132797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.133191 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.145738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.152041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.254195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " 
pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.275394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"auto-csr-approver-29539310-cr6qh\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.457460 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:00 crc kubenswrapper[4792]: I0301 09:50:00.862260 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:50:01 crc kubenswrapper[4792]: I0301 09:50:01.482563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerStarted","Data":"d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366"} Mar 01 09:50:02 crc kubenswrapper[4792]: I0301 09:50:02.491678 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf147424-57ff-455c-9aac-e32adcab851e" containerID="3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940" exitCode=0 Mar 01 09:50:02 crc kubenswrapper[4792]: I0301 09:50:02.491721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerDied","Data":"3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940"} Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.408595 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:03 crc kubenswrapper[4792]: E0301 09:50:03.409198 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.845576 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.924037 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") pod \"bf147424-57ff-455c-9aac-e32adcab851e\" (UID: \"bf147424-57ff-455c-9aac-e32adcab851e\") " Mar 01 09:50:03 crc kubenswrapper[4792]: I0301 09:50:03.932800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm" (OuterVolumeSpecName: "kube-api-access-482sm") pod "bf147424-57ff-455c-9aac-e32adcab851e" (UID: "bf147424-57ff-455c-9aac-e32adcab851e"). InnerVolumeSpecName "kube-api-access-482sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.027010 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-482sm\" (UniqueName: \"kubernetes.io/projected/bf147424-57ff-455c-9aac-e32adcab851e-kube-api-access-482sm\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" event={"ID":"bf147424-57ff-455c-9aac-e32adcab851e","Type":"ContainerDied","Data":"d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366"} Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512325 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fcfd8748da5c290f602e5b1a9738b8fa0a559810526842ef6d99baf1b2b366" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.512322 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539310-cr6qh" Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.513864 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerID="b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb" exitCode=0 Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.514014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerDied","Data":"b3542fb65ba1732d969d1c1e12d706e23a7dab68ee9d6400fa7b59bdcc1e00eb"} Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.918893 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:50:04 crc kubenswrapper[4792]: I0301 09:50:04.926279 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29539304-c2vn2"] Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.419275 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e68f99-8c1f-4046-bb89-66516bff6370" path="/var/lib/kubelet/pods/97e68f99-8c1f-4046-bb89-66516bff6370/volumes" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.919169 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.960767 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961062 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961101 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.961188 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") pod \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\" (UID: \"ff733b23-0a97-4623-9eeb-339aa02fc3b0\") " Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.965890 
4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph" (OuterVolumeSpecName: "ceph") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.966340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx" (OuterVolumeSpecName: "kube-api-access-cxwwx") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "kube-api-access-cxwwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.984678 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:05 crc kubenswrapper[4792]: I0301 09:50:05.989635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory" (OuterVolumeSpecName: "inventory") pod "ff733b23-0a97-4623-9eeb-339aa02fc3b0" (UID: "ff733b23-0a97-4623-9eeb-339aa02fc3b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063736 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063763 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063774 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff733b23-0a97-4623-9eeb-339aa02fc3b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.063783 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxwwx\" (UniqueName: \"kubernetes.io/projected/ff733b23-0a97-4623-9eeb-339aa02fc3b0-kube-api-access-cxwwx\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.576661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" event={"ID":"ff733b23-0a97-4623-9eeb-339aa02fc3b0","Type":"ContainerDied","Data":"db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19"} Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.576946 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db6cb5df12916c3876aa7d66ae1518f0b04d2d2d57926fff5a1fa1d1f6b4ca19" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.577126 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gxxr7" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.623060 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:06 crc kubenswrapper[4792]: E0301 09:50:06.623838 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.623985 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: E0301 09:50:06.624163 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624266 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624655 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff733b23-0a97-4623-9eeb-339aa02fc3b0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.624798 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf147424-57ff-455c-9aac-e32adcab851e" containerName="oc" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.625790 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.632647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633722 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633928 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.633991 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.634022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.635235 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.672817 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.673368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.774921 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.775675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.779991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.792621 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:06 crc kubenswrapper[4792]: I0301 09:50:06.942180 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:07 crc kubenswrapper[4792]: I0301 09:50:07.424246 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d"] Mar 01 09:50:07 crc kubenswrapper[4792]: I0301 09:50:07.585554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerStarted","Data":"f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f"} Mar 01 09:50:08 crc kubenswrapper[4792]: I0301 09:50:08.597645 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerStarted","Data":"aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3"} Mar 01 09:50:08 crc kubenswrapper[4792]: I0301 09:50:08.624820 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" podStartSLOduration=2.238836409 podStartE2EDuration="2.624790786s" podCreationTimestamp="2026-03-01 09:50:06 +0000 UTC" firstStartedPulling="2026-03-01 09:50:07.428278823 +0000 UTC m=+2536.670158030" 
lastFinishedPulling="2026-03-01 09:50:07.81423321 +0000 UTC m=+2537.056112407" observedRunningTime="2026-03-01 09:50:08.616190232 +0000 UTC m=+2537.858069469" watchObservedRunningTime="2026-03-01 09:50:08.624790786 +0000 UTC m=+2537.866670023" Mar 01 09:50:14 crc kubenswrapper[4792]: I0301 09:50:14.409191 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:14 crc kubenswrapper[4792]: E0301 09:50:14.410156 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:17 crc kubenswrapper[4792]: I0301 09:50:17.702313 4792 generic.go:334] "Generic (PLEG): container finished" podID="34275228-a1ab-4955-9d16-d184643a86d1" containerID="aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3" exitCode=0 Mar 01 09:50:17 crc kubenswrapper[4792]: I0301 09:50:17.702400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerDied","Data":"aabeffd021257f62c9a2f4a842164b31c9cdc843be8d52afa279dac1f46235f3"} Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.148425 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316689 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.316832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") pod \"34275228-a1ab-4955-9d16-d184643a86d1\" (UID: \"34275228-a1ab-4955-9d16-d184643a86d1\") " Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.322409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg" (OuterVolumeSpecName: "kube-api-access-27xkg") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "kube-api-access-27xkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.323034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph" (OuterVolumeSpecName: "ceph") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.341272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory" (OuterVolumeSpecName: "inventory") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.344049 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34275228-a1ab-4955-9d16-d184643a86d1" (UID: "34275228-a1ab-4955-9d16-d184643a86d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419470 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27xkg\" (UniqueName: \"kubernetes.io/projected/34275228-a1ab-4955-9d16-d184643a86d1-kube-api-access-27xkg\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419504 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419519 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.419531 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34275228-a1ab-4955-9d16-d184643a86d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755520 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d" event={"ID":"34275228-a1ab-4955-9d16-d184643a86d1","Type":"ContainerDied","Data":"f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f"} Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.755898 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1578421aff2f89fef2f6ea6ebd6c1d8cfc558aa833e2b5fc9ddfc26b93f7d1f" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.818245 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:19 crc kubenswrapper[4792]: E0301 09:50:19.818841 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.818938 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.819229 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="34275228-a1ab-4955-9d16-d184643a86d1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.820226 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825176 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825402 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825504 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825215 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825269 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825359 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.825364 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.830496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.833107 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.850942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg97l\" (UniqueName: 
\"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.850993 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851155 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.851339 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.952450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.952506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953298 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953371 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.953470 
4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.958595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.959398 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.960434 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.962103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.965303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.965444 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.971519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.974414 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.975055 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.975288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:19 crc kubenswrapper[4792]: I0301 09:50:19.976496 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xqprh\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.134479 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.629133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh"] Mar 01 09:50:20 crc kubenswrapper[4792]: I0301 09:50:20.764334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerStarted","Data":"e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf"} Mar 01 09:50:21 crc kubenswrapper[4792]: I0301 09:50:21.773092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerStarted","Data":"662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a"} Mar 01 09:50:21 crc kubenswrapper[4792]: I0301 09:50:21.797381 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" podStartSLOduration=2.384096943 podStartE2EDuration="2.797361379s" podCreationTimestamp="2026-03-01 09:50:19 +0000 UTC" firstStartedPulling="2026-03-01 09:50:20.639484672 +0000 
UTC m=+2549.881363869" lastFinishedPulling="2026-03-01 09:50:21.052749118 +0000 UTC m=+2550.294628305" observedRunningTime="2026-03-01 09:50:21.787203847 +0000 UTC m=+2551.029083044" watchObservedRunningTime="2026-03-01 09:50:21.797361379 +0000 UTC m=+2551.039240566" Mar 01 09:50:28 crc kubenswrapper[4792]: I0301 09:50:28.408644 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:28 crc kubenswrapper[4792]: E0301 09:50:28.409308 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:31 crc kubenswrapper[4792]: I0301 09:50:31.220736 4792 scope.go:117] "RemoveContainer" containerID="3cbaa243041e250919798684d495339949ab384e80a45c460f4d0e0c2cfab407" Mar 01 09:50:41 crc kubenswrapper[4792]: I0301 09:50:41.417955 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:41 crc kubenswrapper[4792]: E0301 09:50:41.418636 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:52 crc kubenswrapper[4792]: I0301 09:50:52.998892 4792 generic.go:334] "Generic (PLEG): container finished" podID="d11c64e6-0562-41d9-a213-f1c5749b4c83" 
containerID="662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a" exitCode=0 Mar 01 09:50:53 crc kubenswrapper[4792]: I0301 09:50:52.999000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerDied","Data":"662eb199da8ca1b1a64e3ef37c17726d184337944bb8aabe6168f79569cae95a"} Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.467738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.477210 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.482150 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578744 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578881 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578906 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578949 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.578971 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579000 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579517 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579578 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579627 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.579674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") pod \"d11c64e6-0562-41d9-a213-f1c5749b4c83\" (UID: \"d11c64e6-0562-41d9-a213-f1c5749b4c83\") " Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.580115 4792 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.582458 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.583203 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.584079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l" (OuterVolumeSpecName: "kube-api-access-sg97l") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "kube-api-access-sg97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.584546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.585592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.586543 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph" (OuterVolumeSpecName: "ceph") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.587197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.589504 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.601577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.602980 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.608996 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.618100 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory" (OuterVolumeSpecName: "inventory") pod "d11c64e6-0562-41d9-a213-f1c5749b4c83" (UID: "d11c64e6-0562-41d9-a213-f1c5749b4c83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.681750 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.681970 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682072 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682135 4792 reconciler_common.go:293] "Volume 
detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682191 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg97l\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-kube-api-access-sg97l\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682245 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682305 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d11c64e6-0562-41d9-a213-f1c5749b4c83-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682360 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682417 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682580 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 
09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682731 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:54 crc kubenswrapper[4792]: I0301 09:50:54.682798 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c64e6-0562-41d9-a213-f1c5749b4c83-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" event={"ID":"d11c64e6-0562-41d9-a213-f1c5749b4c83","Type":"ContainerDied","Data":"e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf"} Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016399 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19e492b8b4c3b1acedd50eee438177fb887e813c2336c4b49ea100012ecdfaf" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.016402 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xqprh" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.205265 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:55 crc kubenswrapper[4792]: E0301 09:50:55.205861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.205883 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.206067 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11c64e6-0562-41d9-a213-f1c5749b4c83" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.206609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.210749 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.211434 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.211684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.212993 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.213311 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.236042 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: 
\"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.291972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393288 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.393330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.397989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.397996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.408034 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: 
\"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.409153 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:50:55 crc kubenswrapper[4792]: E0301 09:50:55.409417 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.410707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:55 crc kubenswrapper[4792]: I0301 09:50:55.520632 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:50:56 crc kubenswrapper[4792]: W0301 09:50:56.024165 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a428e9_b35d_4f80_bb40_c158095d5bfa.slice/crio-35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a WatchSource:0}: Error finding container 35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a: Status 404 returned error can't find the container with id 35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a Mar 01 09:50:56 crc kubenswrapper[4792]: I0301 09:50:56.024607 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj"] Mar 01 09:50:56 crc kubenswrapper[4792]: I0301 09:50:56.027177 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:50:57 crc kubenswrapper[4792]: I0301 09:50:57.032661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerStarted","Data":"f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7"} Mar 01 09:50:57 crc kubenswrapper[4792]: I0301 09:50:57.035146 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerStarted","Data":"35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a"} Mar 01 09:50:58 crc kubenswrapper[4792]: I0301 09:50:58.063083 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" podStartSLOduration=2.313029307 podStartE2EDuration="3.063055782s" podCreationTimestamp="2026-03-01 09:50:55 +0000 UTC" 
firstStartedPulling="2026-03-01 09:50:56.026947773 +0000 UTC m=+2585.268826970" lastFinishedPulling="2026-03-01 09:50:56.776974248 +0000 UTC m=+2586.018853445" observedRunningTime="2026-03-01 09:50:58.059108684 +0000 UTC m=+2587.300987881" watchObservedRunningTime="2026-03-01 09:50:58.063055782 +0000 UTC m=+2587.304934979" Mar 01 09:51:03 crc kubenswrapper[4792]: I0301 09:51:03.087355 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerID="f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7" exitCode=0 Mar 01 09:51:03 crc kubenswrapper[4792]: I0301 09:51:03.087388 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerDied","Data":"f56e4a269d477fb54657636d48bf2e78c700a87055d56a6f2355bd8762f25fe7"} Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.477165 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.520945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.521263 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") pod \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\" (UID: \"f3a428e9-b35d-4f80-bb40-c158095d5bfa\") " Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.526275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4" (OuterVolumeSpecName: "kube-api-access-sjgs4") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "kube-api-access-sjgs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.526336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph" (OuterVolumeSpecName: "ceph") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.545418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.546101 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory" (OuterVolumeSpecName: "inventory") pod "f3a428e9-b35d-4f80-bb40-c158095d5bfa" (UID: "f3a428e9-b35d-4f80-bb40-c158095d5bfa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623425 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623463 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623474 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjgs4\" (UniqueName: \"kubernetes.io/projected/f3a428e9-b35d-4f80-bb40-c158095d5bfa-kube-api-access-sjgs4\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:04 crc kubenswrapper[4792]: I0301 09:51:04.623484 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3a428e9-b35d-4f80-bb40-c158095d5bfa-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" event={"ID":"f3a428e9-b35d-4f80-bb40-c158095d5bfa","Type":"ContainerDied","Data":"35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a"} Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132298 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fac49500e6f43757e1ca0a464c6df2c17291c1695abd5134f47b05e6e3347a" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.132400 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.231383 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:05 crc kubenswrapper[4792]: E0301 09:51:05.231877 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.231986 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.232309 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a428e9-b35d-4f80-bb40-c158095d5bfa" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.233018 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.237589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.237632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.238154 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.238239 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.239245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.240572 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.242522 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336161 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: 
\"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336640 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.336753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.337355 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.337533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439327 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439486 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.439594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.440565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.447560 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.448246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.449338 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.450468 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.463445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bc5rl\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:05 crc kubenswrapper[4792]: I0301 09:51:05.556387 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:51:06 crc kubenswrapper[4792]: I0301 09:51:06.062260 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl"] Mar 01 09:51:06 crc kubenswrapper[4792]: I0301 09:51:06.140944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerStarted","Data":"c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d"} Mar 01 09:51:07 crc kubenswrapper[4792]: I0301 09:51:07.153552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerStarted","Data":"23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d"} Mar 01 09:51:10 crc kubenswrapper[4792]: I0301 09:51:10.409110 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:10 crc kubenswrapper[4792]: E0301 09:51:10.409840 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:21 crc kubenswrapper[4792]: I0301 09:51:21.414728 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:21 crc kubenswrapper[4792]: E0301 09:51:21.415598 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:35 crc kubenswrapper[4792]: I0301 09:51:35.409038 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:35 crc kubenswrapper[4792]: E0301 09:51:35.410001 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:51:46 crc kubenswrapper[4792]: I0301 09:51:46.544775 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:51:46 crc kubenswrapper[4792]: E0301 09:51:46.569684 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.154861 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" podStartSLOduration=54.776781499 podStartE2EDuration="55.154843522s" podCreationTimestamp="2026-03-01 09:51:05 +0000 UTC" firstStartedPulling="2026-03-01 09:51:06.064769059 +0000 
UTC m=+2595.306648256" lastFinishedPulling="2026-03-01 09:51:06.442831082 +0000 UTC m=+2595.684710279" observedRunningTime="2026-03-01 09:51:07.175147328 +0000 UTC m=+2596.417026525" watchObservedRunningTime="2026-03-01 09:52:00.154843522 +0000 UTC m=+2649.396722719" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.158002 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.160551 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185115 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185363 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.185895 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.217151 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.242360 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.343949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn774\" (UniqueName: 
\"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.372181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"auto-csr-approver-29539312-tpqkw\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.500706 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:00 crc kubenswrapper[4792]: I0301 09:52:00.969679 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:52:01 crc kubenswrapper[4792]: I0301 09:52:01.418570 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:01 crc kubenswrapper[4792]: E0301 09:52:01.419054 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:01 crc kubenswrapper[4792]: I0301 09:52:01.711788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" 
event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerStarted","Data":"4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c"} Mar 01 09:52:02 crc kubenswrapper[4792]: I0301 09:52:02.722999 4792 generic.go:334] "Generic (PLEG): container finished" podID="11abc020-6c8a-4de3-8afc-229196293ab0" containerID="89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243" exitCode=0 Mar 01 09:52:02 crc kubenswrapper[4792]: I0301 09:52:02.723045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerDied","Data":"89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243"} Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.007054 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.208460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") pod \"11abc020-6c8a-4de3-8afc-229196293ab0\" (UID: \"11abc020-6c8a-4de3-8afc-229196293ab0\") " Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.214192 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774" (OuterVolumeSpecName: "kube-api-access-mn774") pod "11abc020-6c8a-4de3-8afc-229196293ab0" (UID: "11abc020-6c8a-4de3-8afc-229196293ab0"). InnerVolumeSpecName "kube-api-access-mn774". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.311659 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn774\" (UniqueName: \"kubernetes.io/projected/11abc020-6c8a-4de3-8afc-229196293ab0-kube-api-access-mn774\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" event={"ID":"11abc020-6c8a-4de3-8afc-229196293ab0","Type":"ContainerDied","Data":"4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c"} Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740670 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fba4a423da85678a96691dddd15b5d9d730ca82d3b8aec89a60069a039f8a2c" Mar 01 09:52:04 crc kubenswrapper[4792]: I0301 09:52:04.740720 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539312-tpqkw" Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.072985 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.080587 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539306-tt9nk"] Mar 01 09:52:05 crc kubenswrapper[4792]: I0301 09:52:05.420559 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d29db8-7573-4364-9e18-20658b790d1f" path="/var/lib/kubelet/pods/f5d29db8-7573-4364-9e18-20658b790d1f/volumes" Mar 01 09:52:13 crc kubenswrapper[4792]: I0301 09:52:13.409877 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:13 crc kubenswrapper[4792]: E0301 09:52:13.410751 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:21 crc kubenswrapper[4792]: I0301 09:52:21.879445 4792 generic.go:334] "Generic (PLEG): container finished" podID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerID="23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d" exitCode=0 Mar 01 09:52:21 crc kubenswrapper[4792]: I0301 09:52:21.879588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerDied","Data":"23442558607cc4d8bcc03bb7871d1f97928a51b594e8f3c0b6969fd7fedbf63d"} Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.275714 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.473254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.473315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474279 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474322 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt4tm\" (UniqueName: \"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474382 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.474415 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") pod \"e4b8a64b-6bea-426c-b1f5-2372342d4211\" (UID: \"e4b8a64b-6bea-426c-b1f5-2372342d4211\") " Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.479499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm" (OuterVolumeSpecName: "kube-api-access-mt4tm") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "kube-api-access-mt4tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.480453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph" (OuterVolumeSpecName: "ceph") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.480897 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.498864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.499478 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory" (OuterVolumeSpecName: "inventory") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.507687 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e4b8a64b-6bea-426c-b1f5-2372342d4211" (UID: "e4b8a64b-6bea-426c-b1f5-2372342d4211"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576562 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576601 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576613 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576623 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt4tm\" (UniqueName: 
\"kubernetes.io/projected/e4b8a64b-6bea-426c-b1f5-2372342d4211-kube-api-access-mt4tm\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576632 4792 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.576640 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b8a64b-6bea-426c-b1f5-2372342d4211-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" event={"ID":"e4b8a64b-6bea-426c-b1f5-2372342d4211","Type":"ContainerDied","Data":"c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d"} Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895882 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86975f99bba1488fcf86272ae150d1c14d78a7ddfa1c7f9f1e74f6697ff114d" Mar 01 09:52:23 crc kubenswrapper[4792]: I0301 09:52:23.895900 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bc5rl" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014214 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: E0301 09:52:24.014571 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014586 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: E0301 09:52:24.014600 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014608 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014779 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" containerName="oc" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.014800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b8a64b-6bea-426c-b1f5-2372342d4211" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.015425 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.017748 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.017900 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018009 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.018834 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.019166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.019407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.033840 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186344 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc 
kubenswrapper[4792]: I0301 09:52:24.186413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186453 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.186481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.187038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 
09:52:24.187087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.187181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.288869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.289629 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.292804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.293145 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.294563 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.295678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") 
" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.298492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.300116 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.310789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.364522 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.684838 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt"] Mar 01 09:52:24 crc kubenswrapper[4792]: I0301 09:52:24.919327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerStarted","Data":"87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259"} Mar 01 09:52:25 crc kubenswrapper[4792]: I0301 09:52:25.928120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerStarted","Data":"b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c"} Mar 01 09:52:25 crc kubenswrapper[4792]: I0301 09:52:25.955019 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" podStartSLOduration=2.521233301 podStartE2EDuration="2.955001046s" podCreationTimestamp="2026-03-01 09:52:23 +0000 UTC" firstStartedPulling="2026-03-01 09:52:24.691662615 +0000 UTC m=+2673.933541812" lastFinishedPulling="2026-03-01 09:52:25.12543036 +0000 UTC m=+2674.367309557" observedRunningTime="2026-03-01 09:52:25.949346956 +0000 UTC m=+2675.191226153" watchObservedRunningTime="2026-03-01 09:52:25.955001046 +0000 UTC m=+2675.196880243" Mar 01 09:52:27 crc kubenswrapper[4792]: I0301 09:52:27.408946 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:27 crc kubenswrapper[4792]: E0301 09:52:27.409433 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:31 crc kubenswrapper[4792]: I0301 09:52:31.315796 4792 scope.go:117] "RemoveContainer" containerID="739afaccd13faa05c6c15e2c6b70ac689c35aa13a309f0b869c97a20dddff65e" Mar 01 09:52:40 crc kubenswrapper[4792]: I0301 09:52:40.409299 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:40 crc kubenswrapper[4792]: E0301 09:52:40.410245 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:52:53 crc kubenswrapper[4792]: I0301 09:52:53.409996 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:52:53 crc kubenswrapper[4792]: E0301 09:52:53.411048 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:06 crc kubenswrapper[4792]: I0301 09:53:06.408817 4792 scope.go:117] "RemoveContainer" 
containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:06 crc kubenswrapper[4792]: E0301 09:53:06.409464 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:19 crc kubenswrapper[4792]: I0301 09:53:19.409712 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:19 crc kubenswrapper[4792]: E0301 09:53:19.410568 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:26 crc kubenswrapper[4792]: I0301 09:53:26.421058 4792 generic.go:334] "Generic (PLEG): container finished" podID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerID="b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c" exitCode=0 Mar 01 09:53:26 crc kubenswrapper[4792]: I0301 09:53:26.421332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerDied","Data":"b89dd5932c826da6e3e27b6cf0817bc475f510c4beb76bbcb78cfad314ff427c"} Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.809832 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849312 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849337 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.849397 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850242 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850457 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.850513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") pod \"f737af00-5e6f-4a95-bf94-738b72990ebd\" (UID: \"f737af00-5e6f-4a95-bf94-738b72990ebd\") " Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.858952 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2" (OuterVolumeSpecName: "kube-api-access-cr4f2") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "kube-api-access-cr4f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.859106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.859155 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph" (OuterVolumeSpecName: "ceph") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.884041 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.886335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory" (OuterVolumeSpecName: "inventory") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.889765 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.889829 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f737af00-5e6f-4a95-bf94-738b72990ebd" (UID: "f737af00-5e6f-4a95-bf94-738b72990ebd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.952996 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953173 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953247 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4f2\" (UniqueName: \"kubernetes.io/projected/f737af00-5e6f-4a95-bf94-738b72990ebd-kube-api-access-cr4f2\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953304 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953355 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953405 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:27 crc kubenswrapper[4792]: I0301 09:53:27.953458 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f737af00-5e6f-4a95-bf94-738b72990ebd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.448791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" event={"ID":"f737af00-5e6f-4a95-bf94-738b72990ebd","Type":"ContainerDied","Data":"87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259"} Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.448828 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87d887549298c09ffe2d57352b333fabf0848f6fb2f69c640fc971765e3e3259" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.449417 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.538902 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:28 crc kubenswrapper[4792]: E0301 09:53:28.539368 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.539395 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.539664 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f737af00-5e6f-4a95-bf94-738b72990ebd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.540486 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.545972 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546430 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546555 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.546670 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.551158 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563786 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.563967 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665247 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.665827 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.670122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.671236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.671996 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc 
kubenswrapper[4792]: I0301 09:53:28.672539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.672817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.688614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:28 crc kubenswrapper[4792]: I0301 09:53:28.858544 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:53:29 crc kubenswrapper[4792]: I0301 09:53:29.370611 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt"] Mar 01 09:53:29 crc kubenswrapper[4792]: W0301 09:53:29.383645 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7230f65_7e9a_4455_8d25_c49393bfbafe.slice/crio-49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897 WatchSource:0}: Error finding container 49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897: Status 404 returned error can't find the container with id 49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897 Mar 01 09:53:29 crc kubenswrapper[4792]: I0301 09:53:29.457867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerStarted","Data":"49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897"} Mar 01 09:53:30 crc kubenswrapper[4792]: I0301 09:53:30.466434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerStarted","Data":"e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674"} Mar 01 09:53:30 crc kubenswrapper[4792]: I0301 09:53:30.484113 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" podStartSLOduration=2.002514313 podStartE2EDuration="2.484090492s" podCreationTimestamp="2026-03-01 09:53:28 +0000 UTC" firstStartedPulling="2026-03-01 09:53:29.38647308 +0000 UTC m=+2738.628352277" lastFinishedPulling="2026-03-01 09:53:29.868049259 +0000 UTC m=+2739.109928456" 
observedRunningTime="2026-03-01 09:53:30.4795671 +0000 UTC m=+2739.721446297" watchObservedRunningTime="2026-03-01 09:53:30.484090492 +0000 UTC m=+2739.725969689" Mar 01 09:53:33 crc kubenswrapper[4792]: I0301 09:53:33.408997 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:33 crc kubenswrapper[4792]: E0301 09:53:33.409499 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:53:48 crc kubenswrapper[4792]: I0301 09:53:48.408476 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:53:48 crc kubenswrapper[4792]: E0301 09:53:48.409257 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.151334 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.158311 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.163252 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.163252 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.167162 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.186545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.252284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.353498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.383648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"auto-csr-approver-29539314-9brl7\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " 
pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.409301 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:00 crc kubenswrapper[4792]: E0301 09:54:00.409947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.487654 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:00 crc kubenswrapper[4792]: I0301 09:54:00.955564 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 09:54:01 crc kubenswrapper[4792]: I0301 09:54:01.743401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerStarted","Data":"81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef"} Mar 01 09:54:02 crc kubenswrapper[4792]: I0301 09:54:02.751772 4792 generic.go:334] "Generic (PLEG): container finished" podID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerID="19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904" exitCode=0 Mar 01 09:54:02 crc kubenswrapper[4792]: I0301 09:54:02.751865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerDied","Data":"19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904"} 
Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.050235 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.137390 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") pod \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\" (UID: \"d633fb4c-b1e3-463f-af0a-2891b7130fc0\") " Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.155128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw" (OuterVolumeSpecName: "kube-api-access-bzcbw") pod "d633fb4c-b1e3-463f-af0a-2891b7130fc0" (UID: "d633fb4c-b1e3-463f-af0a-2891b7130fc0"). InnerVolumeSpecName "kube-api-access-bzcbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.239799 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzcbw\" (UniqueName: \"kubernetes.io/projected/d633fb4c-b1e3-463f-af0a-2891b7130fc0-kube-api-access-bzcbw\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539314-9brl7" event={"ID":"d633fb4c-b1e3-463f-af0a-2891b7130fc0","Type":"ContainerDied","Data":"81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef"} Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769115 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81081e6aa3da0609653570cbd97f437d3c30358ab7d1ccfd226d44ca883e5aef" Mar 01 09:54:04 crc kubenswrapper[4792]: I0301 09:54:04.769132 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539314-9brl7" Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.124550 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.138559 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539308-lnc4n"] Mar 01 09:54:05 crc kubenswrapper[4792]: I0301 09:54:05.420365 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3f198d-a642-45b3-9a5a-fd5906670db8" path="/var/lib/kubelet/pods/1e3f198d-a642-45b3-9a5a-fd5906670db8/volumes" Mar 01 09:54:11 crc kubenswrapper[4792]: I0301 09:54:11.415230 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:11 crc kubenswrapper[4792]: E0301 09:54:11.416286 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:22 crc kubenswrapper[4792]: I0301 09:54:22.408520 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:22 crc kubenswrapper[4792]: E0301 09:54:22.409527 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:31 crc kubenswrapper[4792]: I0301 09:54:31.430669 4792 scope.go:117] "RemoveContainer" containerID="cde0b22712c7c2f1430743fdebf0e1e49438b47b056e66c49fd78cf546ba54f9" Mar 01 09:54:33 crc kubenswrapper[4792]: I0301 09:54:33.409020 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:33 crc kubenswrapper[4792]: E0301 09:54:33.409563 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176369 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:34 crc kubenswrapper[4792]: E0301 09:54:34.176725 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176743 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.176942 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" containerName="oc" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.178400 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.191027 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292898 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.292946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.394595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.394982 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.395305 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.416707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"redhat-operators-c2xpd\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:34 crc kubenswrapper[4792]: I0301 09:54:34.511850 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:35 crc kubenswrapper[4792]: I0301 09:54:35.000928 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:35 crc kubenswrapper[4792]: I0301 09:54:35.020881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"ee907b4b929868643143bf7639b65b69ce3a2888c7d73c8cbf27659d5681a348"} Mar 01 09:54:36 crc kubenswrapper[4792]: I0301 09:54:36.031712 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" exitCode=0 Mar 01 09:54:36 crc kubenswrapper[4792]: I0301 09:54:36.031779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8"} Mar 01 09:54:37 crc kubenswrapper[4792]: I0301 09:54:37.046955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} Mar 01 09:54:42 crc kubenswrapper[4792]: I0301 09:54:42.086296 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" exitCode=0 Mar 01 09:54:42 crc kubenswrapper[4792]: I0301 09:54:42.086384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" 
event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} Mar 01 09:54:43 crc kubenswrapper[4792]: I0301 09:54:43.107026 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerStarted","Data":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} Mar 01 09:54:43 crc kubenswrapper[4792]: I0301 09:54:43.127045 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2xpd" podStartSLOduration=2.69614874 podStartE2EDuration="9.127017224s" podCreationTimestamp="2026-03-01 09:54:34 +0000 UTC" firstStartedPulling="2026-03-01 09:54:36.034064606 +0000 UTC m=+2805.275943803" lastFinishedPulling="2026-03-01 09:54:42.46493308 +0000 UTC m=+2811.706812287" observedRunningTime="2026-03-01 09:54:43.124550583 +0000 UTC m=+2812.366429790" watchObservedRunningTime="2026-03-01 09:54:43.127017224 +0000 UTC m=+2812.368896451" Mar 01 09:54:44 crc kubenswrapper[4792]: I0301 09:54:44.513149 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:44 crc kubenswrapper[4792]: I0301 09:54:44.514110 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:45 crc kubenswrapper[4792]: I0301 09:54:45.410956 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:54:45 crc kubenswrapper[4792]: I0301 09:54:45.558689 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2xpd" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" probeResult="failure" output=< Mar 01 09:54:45 crc kubenswrapper[4792]: timeout: failed to connect 
service ":50051" within 1s Mar 01 09:54:45 crc kubenswrapper[4792]: > Mar 01 09:54:46 crc kubenswrapper[4792]: I0301 09:54:46.139247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.552571 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.597291 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:54 crc kubenswrapper[4792]: I0301 09:54:54.786810 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.226063 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2xpd" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" containerID="cri-o://32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" gracePeriod=2 Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.695542 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727624 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.727656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") pod \"18317120-5fe3-415e-9646-44ec3a528eb7\" (UID: \"18317120-5fe3-415e-9646-44ec3a528eb7\") " Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.728490 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities" (OuterVolumeSpecName: "utilities") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.734189 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn" (OuterVolumeSpecName: "kube-api-access-zc2cn") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "kube-api-access-zc2cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.830488 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.830525 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2cn\" (UniqueName: \"kubernetes.io/projected/18317120-5fe3-415e-9646-44ec3a528eb7-kube-api-access-zc2cn\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.852649 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18317120-5fe3-415e-9646-44ec3a528eb7" (UID: "18317120-5fe3-415e-9646-44ec3a528eb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:54:56 crc kubenswrapper[4792]: I0301 09:54:56.931427 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18317120-5fe3-415e-9646-44ec3a528eb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236594 4792 generic.go:334] "Generic (PLEG): container finished" podID="18317120-5fe3-415e-9646-44ec3a528eb7" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" exitCode=0 Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2xpd" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.236658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.237735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2xpd" event={"ID":"18317120-5fe3-415e-9646-44ec3a528eb7","Type":"ContainerDied","Data":"ee907b4b929868643143bf7639b65b69ce3a2888c7d73c8cbf27659d5681a348"} Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.237756 4792 scope.go:117] "RemoveContainer" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.267852 4792 scope.go:117] "RemoveContainer" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.278444 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.297947 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2xpd"] Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.301205 4792 scope.go:117] "RemoveContainer" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.346802 4792 scope.go:117] "RemoveContainer" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.347686 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": container with ID starting with 32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74 not found: ID does not exist" containerID="32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.347728 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74"} err="failed to get container status \"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": rpc error: code = NotFound desc = could not find container \"32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74\": container with ID starting with 32b275015fe6c36ca8db45d58d20e68bddc88bd9f4816385a5f919ea21c9fa74 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.347806 4792 scope.go:117] "RemoveContainer" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.348646 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": container with ID starting with 86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0 not found: ID does not exist" containerID="86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348674 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0"} err="failed to get container status \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": rpc error: code = NotFound desc = could not find container \"86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0\": container with ID 
starting with 86395a2da12a91c13bfefd6b1284cb572b9ad1954fa1f434ffd1bbad0b6f04d0 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348687 4792 scope.go:117] "RemoveContainer" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: E0301 09:54:57.348917 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": container with ID starting with b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8 not found: ID does not exist" containerID="b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.348936 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8"} err="failed to get container status \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": rpc error: code = NotFound desc = could not find container \"b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8\": container with ID starting with b2f9d0e40b0a2b67e86560f5ccd1a86dd3a00bd2b6f96ef056d30c86017d4ef8 not found: ID does not exist" Mar 01 09:54:57 crc kubenswrapper[4792]: I0301 09:54:57.419389 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" path="/var/lib/kubelet/pods/18317120-5fe3-415e-9646-44ec3a528eb7/volumes" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.142043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.142949 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc 
kubenswrapper[4792]: I0301 09:56:00.142965 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.142981 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-utilities" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.142989 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-utilities" Mar 01 09:56:00 crc kubenswrapper[4792]: E0301 09:56:00.143017 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-content" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143026 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="extract-content" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143199 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="18317120-5fe3-415e-9646-44ec3a528eb7" containerName="registry-server" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.143743 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.145971 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.146010 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.146317 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.149464 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.160534 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.251388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.269465 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"auto-csr-approver-29539316-8swf8\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " 
pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.461065 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.900949 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 09:56:00 crc kubenswrapper[4792]: W0301 09:56:00.903582 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9273c82_13c3_43c5_b90e_16fdb09f082e.slice/crio-1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843 WatchSource:0}: Error finding container 1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843: Status 404 returned error can't find the container with id 1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843 Mar 01 09:56:00 crc kubenswrapper[4792]: I0301 09:56:00.905697 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 09:56:01 crc kubenswrapper[4792]: I0301 09:56:01.801884 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerStarted","Data":"1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843"} Mar 01 09:56:02 crc kubenswrapper[4792]: I0301 09:56:02.810833 4792 generic.go:334] "Generic (PLEG): container finished" podID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerID="3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926" exitCode=0 Mar 01 09:56:02 crc kubenswrapper[4792]: I0301 09:56:02.810930 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" 
event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerDied","Data":"3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926"} Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.133193 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.318203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") pod \"e9273c82-13c3-43c5-b90e-16fdb09f082e\" (UID: \"e9273c82-13c3-43c5-b90e-16fdb09f082e\") " Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.325220 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg" (OuterVolumeSpecName: "kube-api-access-qfcjg") pod "e9273c82-13c3-43c5-b90e-16fdb09f082e" (UID: "e9273c82-13c3-43c5-b90e-16fdb09f082e"). InnerVolumeSpecName "kube-api-access-qfcjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.420251 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfcjg\" (UniqueName: \"kubernetes.io/projected/e9273c82-13c3-43c5-b90e-16fdb09f082e-kube-api-access-qfcjg\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539316-8swf8" event={"ID":"e9273c82-13c3-43c5-b90e-16fdb09f082e","Type":"ContainerDied","Data":"1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843"} Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828139 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dad835c77013b58809a520c2651be9bac6b64755244aae7aa6a7ab543562843" Mar 01 09:56:04 crc kubenswrapper[4792]: I0301 09:56:04.828152 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539316-8swf8" Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.204724 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.212094 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539310-cr6qh"] Mar 01 09:56:05 crc kubenswrapper[4792]: I0301 09:56:05.418005 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf147424-57ff-455c-9aac-e32adcab851e" path="/var/lib/kubelet/pods/bf147424-57ff-455c-9aac-e32adcab851e/volumes" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.953659 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:22 crc kubenswrapper[4792]: E0301 09:56:22.955727 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.955815 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.956098 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" containerName="oc" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.957463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.970318 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993538 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:22 crc kubenswrapper[4792]: I0301 09:56:22.993637 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " 
pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.096879 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " 
pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.128532 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"community-operators-gd957\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.310377 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.838162 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:23 crc kubenswrapper[4792]: I0301 09:56:23.982013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"87c281219a698791250b09d69a542e87c72aa7e1f46af736492aa9a12cf9e627"} Mar 01 09:56:24 crc kubenswrapper[4792]: I0301 09:56:24.990172 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a" exitCode=0 Mar 01 09:56:24 crc kubenswrapper[4792]: I0301 09:56:24.990246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"} Mar 01 09:56:26 crc kubenswrapper[4792]: I0301 09:56:25.999656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" 
event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"} Mar 01 09:56:27 crc kubenswrapper[4792]: I0301 09:56:27.010742 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3" exitCode=0 Mar 01 09:56:27 crc kubenswrapper[4792]: I0301 09:56:27.010788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"} Mar 01 09:56:28 crc kubenswrapper[4792]: I0301 09:56:28.021284 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerStarted","Data":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"} Mar 01 09:56:28 crc kubenswrapper[4792]: I0301 09:56:28.043709 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gd957" podStartSLOduration=3.6669071989999997 podStartE2EDuration="6.043687373s" podCreationTimestamp="2026-03-01 09:56:22 +0000 UTC" firstStartedPulling="2026-03-01 09:56:24.993047502 +0000 UTC m=+2914.234926699" lastFinishedPulling="2026-03-01 09:56:27.369827676 +0000 UTC m=+2916.611706873" observedRunningTime="2026-03-01 09:56:28.03873579 +0000 UTC m=+2917.280614987" watchObservedRunningTime="2026-03-01 09:56:28.043687373 +0000 UTC m=+2917.285566560" Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.870459 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.874235 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.882703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:30 crc kubenswrapper[4792]: I0301 09:56:30.930805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032257 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032681 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.032835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.062654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"redhat-marketplace-f6q2t\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.198739 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.564149 4792 scope.go:117] "RemoveContainer" containerID="3be8237f11ee8a9c2a66a6dce0cfdb5c72c7e1c7d5445dc46a852faf899f2940" Mar 01 09:56:31 crc kubenswrapper[4792]: I0301 09:56:31.667815 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:31 crc kubenswrapper[4792]: W0301 09:56:31.687382 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01308ffe_b402_42f1_8895_22ee5823304b.slice/crio-27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec WatchSource:0}: Error finding container 27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec: Status 404 returned error can't find the container with id 27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053079 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca" exitCode=0 Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"} Mar 01 09:56:32 crc kubenswrapper[4792]: I0301 09:56:32.053164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec"} Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.063647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"} Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.311153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.311209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:33 crc kubenswrapper[4792]: I0301 09:56:33.496037 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:34 crc kubenswrapper[4792]: I0301 09:56:34.117440 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.079070 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200" exitCode=0 Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.079156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"} Mar 01 09:56:35 crc kubenswrapper[4792]: I0301 09:56:35.916697 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.089341 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gd957" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" 
containerID="cri-o://a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" gracePeriod=2 Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.089790 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerStarted","Data":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"} Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.112311 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6q2t" podStartSLOduration=2.567439225 podStartE2EDuration="6.112290979s" podCreationTimestamp="2026-03-01 09:56:30 +0000 UTC" firstStartedPulling="2026-03-01 09:56:32.055549364 +0000 UTC m=+2921.297428561" lastFinishedPulling="2026-03-01 09:56:35.600401118 +0000 UTC m=+2924.842280315" observedRunningTime="2026-03-01 09:56:36.108197147 +0000 UTC m=+2925.350076344" watchObservedRunningTime="2026-03-01 09:56:36.112290979 +0000 UTC m=+2925.354170176" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.578787 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639185 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.639252 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") pod \"334b1950-af06-4648-98dc-543534fc216a\" (UID: \"334b1950-af06-4648-98dc-543534fc216a\") " Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.640128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities" (OuterVolumeSpecName: "utilities") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.645843 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm" (OuterVolumeSpecName: "kube-api-access-gfwsm") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "kube-api-access-gfwsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.716743 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "334b1950-af06-4648-98dc-543534fc216a" (UID: "334b1950-af06-4648-98dc-543534fc216a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741059 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfwsm\" (UniqueName: \"kubernetes.io/projected/334b1950-af06-4648-98dc-543534fc216a-kube-api-access-gfwsm\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741091 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:36 crc kubenswrapper[4792]: I0301 09:56:36.741100 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/334b1950-af06-4648-98dc-543534fc216a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.108856 4792 generic.go:334] "Generic (PLEG): container finished" podID="334b1950-af06-4648-98dc-543534fc216a" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" exitCode=0 Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110230 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"} Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110268 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-gd957" event={"ID":"334b1950-af06-4648-98dc-543534fc216a","Type":"ContainerDied","Data":"87c281219a698791250b09d69a542e87c72aa7e1f46af736492aa9a12cf9e627"} Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110286 4792 scope.go:117] "RemoveContainer" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.110505 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gd957" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.147346 4792 scope.go:117] "RemoveContainer" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.199203 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.210287 4792 scope.go:117] "RemoveContainer" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.218559 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gd957"] Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.248081 4792 scope.go:117] "RemoveContainer" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" Mar 01 09:56:37 crc kubenswrapper[4792]: E0301 09:56:37.251983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": container with ID starting with a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe not found: ID does not exist" containerID="a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 
09:56:37.252027 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe"} err="failed to get container status \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": rpc error: code = NotFound desc = could not find container \"a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe\": container with ID starting with a59b136bc0800aa98bbdde4b0986c346a3752f954016918404b0d9226b16f2fe not found: ID does not exist" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.252058 4792 scope.go:117] "RemoveContainer" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3" Mar 01 09:56:37 crc kubenswrapper[4792]: E0301 09:56:37.255358 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": container with ID starting with 6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3 not found: ID does not exist" containerID="6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.255417 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3"} err="failed to get container status \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": rpc error: code = NotFound desc = could not find container \"6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3\": container with ID starting with 6bfcbdc1221bb94ee688c99c1c0085afbc86471f934e984a6a66410e9171f1b3 not found: ID does not exist" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.255440 4792 scope.go:117] "RemoveContainer" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a" Mar 01 09:56:37 crc 
kubenswrapper[4792]: E0301 09:56:37.262247 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": container with ID starting with 519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a not found: ID does not exist" containerID="519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.262295 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a"} err="failed to get container status \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": rpc error: code = NotFound desc = could not find container \"519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a\": container with ID starting with 519bc9f272a1ce5a790a51d0a515e723ba56516939324742b270e925ac89eb9a not found: ID does not exist" Mar 01 09:56:37 crc kubenswrapper[4792]: I0301 09:56:37.420215 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334b1950-af06-4648-98dc-543534fc216a" path="/var/lib/kubelet/pods/334b1950-af06-4648-98dc-543534fc216a/volumes" Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.199702 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.200275 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:41 crc kubenswrapper[4792]: I0301 09:56:41.294177 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:42 crc kubenswrapper[4792]: I0301 09:56:42.208805 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:42 crc kubenswrapper[4792]: I0301 09:56:42.259000 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:44 crc kubenswrapper[4792]: I0301 09:56:44.174216 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6q2t" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" containerID="cri-o://eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" gracePeriod=2 Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.089644 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182724 4792 generic.go:334] "Generic (PLEG): container finished" podID="01308ffe-b402-42f1-8895-22ee5823304b" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" exitCode=0 Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182776 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"} Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6q2t" event={"ID":"01308ffe-b402-42f1-8895-22ee5823304b","Type":"ContainerDied","Data":"27f6a0cf37bf0be2aecf2f55c75d9a40b34c5ba6225ba36896bc7c41c0d112ec"} Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.182911 4792 scope.go:117] "RemoveContainer" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.183744 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6q2t" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.207547 4792 scope.go:117] "RemoveContainer" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.230926 4792 scope.go:117] "RemoveContainer" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270194 4792 scope.go:117] "RemoveContainer" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.270539 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": container with ID starting with eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696 not found: ID does not exist" containerID="eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270574 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696"} err="failed to get container status \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": rpc error: code = NotFound desc = could not find container \"eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696\": container with ID starting with eb816dd8310f585c322a71e004114ee79875bc1a853db9c0adc5bbf211eb0696 not found: ID does not exist" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.270595 4792 scope.go:117] "RemoveContainer" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200" Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.271045 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": container with ID starting with ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200 not found: ID does not exist" containerID="ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271069 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200"} err="failed to get container status \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": rpc error: code = NotFound desc = could not find container \"ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200\": container with ID starting with ce11c9f3486d2dacf072897c2dfb0af5686df38bebcd0809b74c3bd8809a6200 not found: ID does not exist" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271084 4792 scope.go:117] "RemoveContainer" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca" Mar 01 09:56:45 crc kubenswrapper[4792]: E0301 09:56:45.271361 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": container with ID starting with d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca not found: ID does not exist" containerID="d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.271391 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca"} err="failed to get container status \"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": rpc error: code = NotFound desc = could not find container 
\"d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca\": container with ID starting with d15fdb0df7769f9939ae8c03c32db0f6fe6adfab99371fe79bbe6e95ddd1dfca not found: ID does not exist" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.287782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") pod \"01308ffe-b402-42f1-8895-22ee5823304b\" (UID: \"01308ffe-b402-42f1-8895-22ee5823304b\") " Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.290778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities" (OuterVolumeSpecName: "utilities") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.293455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj" (OuterVolumeSpecName: "kube-api-access-g5hgj") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "kube-api-access-g5hgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:56:45 crc kubenswrapper[4792]: I0301 09:56:45.317694 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01308ffe-b402-42f1-8895-22ee5823304b" (UID: "01308ffe-b402-42f1-8895-22ee5823304b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040188 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040242 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01308ffe-b402-42f1-8895-22ee5823304b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.040259 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5hgj\" (UniqueName: \"kubernetes.io/projected/01308ffe-b402-42f1-8895-22ee5823304b-kube-api-access-g5hgj\") on node \"crc\" DevicePath \"\"" Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 09:56:46.143180 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:46 crc kubenswrapper[4792]: I0301 
09:56:46.188474 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6q2t"] Mar 01 09:56:47 crc kubenswrapper[4792]: I0301 09:56:47.421891 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01308ffe-b402-42f1-8895-22ee5823304b" path="/var/lib/kubelet/pods/01308ffe-b402-42f1-8895-22ee5823304b/volumes" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.900248 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901190 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-utilities" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901206 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-utilities" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901224 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-content" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901233 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-content" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901243 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-content" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901251 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="extract-content" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901279 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-utilities" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 
09:57:02.901288 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="extract-utilities" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901303 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901312 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: E0301 09:57:02.901362 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901371 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901605 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="334b1950-af06-4648-98dc-543534fc216a" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.901629 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="01308ffe-b402-42f1-8895-22ee5823304b" containerName="registry-server" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.903347 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:02 crc kubenswrapper[4792]: I0301 09:57:02.913942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.035488 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.035887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.036117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138367 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138467 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.138951 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.139073 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.169449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"certified-operators-srpk8\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.296157 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:03 crc kubenswrapper[4792]: I0301 09:57:03.860475 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349249 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" exitCode=0 Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0"} Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.349558 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"08062e533d06ffaf9308c7a8f8ae79e0e90f4f7bd9be17fa74bd0dac1d520e96"} Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.943477 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:57:04 crc kubenswrapper[4792]: I0301 09:57:04.943778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:57:05 crc kubenswrapper[4792]: I0301 09:57:05.359232 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} Mar 01 09:57:08 crc kubenswrapper[4792]: I0301 09:57:08.385575 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" exitCode=0 Mar 01 09:57:08 crc kubenswrapper[4792]: I0301 09:57:08.385655 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} Mar 01 09:57:09 crc kubenswrapper[4792]: I0301 09:57:09.396353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerStarted","Data":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} Mar 01 09:57:09 crc kubenswrapper[4792]: I0301 09:57:09.440128 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srpk8" podStartSLOduration=2.919147154 podStartE2EDuration="7.440108657s" podCreationTimestamp="2026-03-01 09:57:02 +0000 UTC" firstStartedPulling="2026-03-01 09:57:04.352192228 +0000 UTC m=+2953.594071415" lastFinishedPulling="2026-03-01 09:57:08.873153721 +0000 UTC m=+2958.115032918" observedRunningTime="2026-03-01 09:57:09.428545271 +0000 UTC m=+2958.670424468" watchObservedRunningTime="2026-03-01 09:57:09.440108657 +0000 UTC m=+2958.681987854" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 09:57:13.297457 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 
09:57:13.297751 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:13 crc kubenswrapper[4792]: I0301 09:57:13.347187 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:23 crc kubenswrapper[4792]: I0301 09:57:23.371481 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.049970 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.050250 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srpk8" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" containerID="cri-o://1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" gracePeriod=2 Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.505406 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558037 4792 generic.go:334] "Generic (PLEG): container finished" podID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" exitCode=0 Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558078 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srpk8" event={"ID":"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b","Type":"ContainerDied","Data":"08062e533d06ffaf9308c7a8f8ae79e0e90f4f7bd9be17fa74bd0dac1d520e96"} Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558120 4792 scope.go:117] "RemoveContainer" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.558209 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srpk8" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.585895 4792 scope.go:117] "RemoveContainer" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.613116 4792 scope.go:117] "RemoveContainer" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649464 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.649568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") pod \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\" (UID: \"b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b\") " Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.652971 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities" (OuterVolumeSpecName: "utilities") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.658938 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v" (OuterVolumeSpecName: "kube-api-access-x246v") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "kube-api-access-x246v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.661932 4792 scope.go:117] "RemoveContainer" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.662430 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": container with ID starting with 1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05 not found: ID does not exist" containerID="1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662457 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05"} err="failed to get container status \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": rpc error: code = NotFound desc = could not find container \"1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05\": container with ID starting with 1e8631ae9531f7382b5e81ac945cf2990503490ca014a18f568e761944deaa05 not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662477 4792 scope.go:117] "RemoveContainer" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.662765 
4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": container with ID starting with d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a not found: ID does not exist" containerID="d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662784 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a"} err="failed to get container status \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": rpc error: code = NotFound desc = could not find container \"d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a\": container with ID starting with d38cdb1d4619e258f809a277bacae0934764dfb7a7db89bbc071f45136202a0a not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.662798 4792 scope.go:117] "RemoveContainer" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: E0301 09:57:26.663261 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": container with ID starting with c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0 not found: ID does not exist" containerID="c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.663286 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0"} err="failed to get container status \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": rpc error: code = 
NotFound desc = could not find container \"c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0\": container with ID starting with c9012413d5c592c5b40c5f928a05c5b3f4829ea8e19af19dc3f1a9cf031570d0 not found: ID does not exist" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.709799 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" (UID: "b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752003 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752068 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x246v\" (UniqueName: \"kubernetes.io/projected/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-kube-api-access-x246v\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.752081 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.890173 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:26 crc kubenswrapper[4792]: I0301 09:57:26.897657 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srpk8"] Mar 01 09:57:27 crc kubenswrapper[4792]: I0301 09:57:27.419600 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" path="/var/lib/kubelet/pods/b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b/volumes" Mar 01 09:57:34 crc kubenswrapper[4792]: I0301 09:57:34.943329 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:57:34 crc kubenswrapper[4792]: I0301 09:57:34.943924 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:57:44 crc kubenswrapper[4792]: I0301 09:57:44.706179 4792 generic.go:334] "Generic (PLEG): container finished" podID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerID="e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674" exitCode=0 Mar 01 09:57:44 crc kubenswrapper[4792]: I0301 09:57:44.706270 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerDied","Data":"e854573aa1d54c447e18219253a483ccbce7dfebd37e9e5c1c0e176ad1346674"} Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.085437 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160469 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.160505 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") pod \"c7230f65-7e9a-4455-8d25-c49393bfbafe\" (UID: \"c7230f65-7e9a-4455-8d25-c49393bfbafe\") " Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.165923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr" (OuterVolumeSpecName: "kube-api-access-thspr") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "kube-api-access-thspr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.166581 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.168800 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph" (OuterVolumeSpecName: "ceph") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.190279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory" (OuterVolumeSpecName: "inventory") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.190441 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.192572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c7230f65-7e9a-4455-8d25-c49393bfbafe" (UID: "c7230f65-7e9a-4455-8d25-c49393bfbafe"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262589 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262624 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262637 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262646 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262654 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thspr\" (UniqueName: \"kubernetes.io/projected/c7230f65-7e9a-4455-8d25-c49393bfbafe-kube-api-access-thspr\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.262664 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7230f65-7e9a-4455-8d25-c49393bfbafe-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724386 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" event={"ID":"c7230f65-7e9a-4455-8d25-c49393bfbafe","Type":"ContainerDied","Data":"49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897"} Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724737 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49739f0dd600f78c2fce8ea6d1d733fa8e1503a99144161a871f4dba20c10897" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.724601 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822446 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822787 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822804 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-utilities" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822838 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-utilities" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822846 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-content" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822852 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="extract-content" Mar 01 09:57:46 crc kubenswrapper[4792]: E0301 09:57:46.822861 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.822867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823028 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b81ad78c-fa6b-4bd0-9ac4-edab0eefa25b" containerName="registry-server" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823044 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7230f65-7e9a-4455-8d25-c49393bfbafe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.823644 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827064 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.827638 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.828187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vqr5" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829486 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829517 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829650 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829718 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.829947 
4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.844614 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877599 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877702 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877768 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.877928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.878144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979953 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.979979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: 
\"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980025 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 
09:57:46.980131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.980245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.981001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.981217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.984913 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: 
\"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985486 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.985674 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.986375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc 
kubenswrapper[4792]: I0301 09:57:46.987551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.987568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.988986 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:46 crc kubenswrapper[4792]: I0301 09:57:46.989196 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.000818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.006581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.143142 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.659917 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7"] Mar 01 09:57:47 crc kubenswrapper[4792]: I0301 09:57:47.733311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerStarted","Data":"f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9"} Mar 01 09:57:48 crc kubenswrapper[4792]: I0301 09:57:48.742046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerStarted","Data":"136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979"} Mar 01 09:57:48 crc kubenswrapper[4792]: I0301 09:57:48.764084 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" podStartSLOduration=2.287963233 podStartE2EDuration="2.76406713s" podCreationTimestamp="2026-03-01 09:57:46 +0000 UTC" firstStartedPulling="2026-03-01 09:57:47.665652303 +0000 UTC m=+2996.907531500" lastFinishedPulling="2026-03-01 09:57:48.1417562 +0000 UTC m=+2997.383635397" observedRunningTime="2026-03-01 09:57:48.758507622 +0000 UTC m=+2998.000386829" watchObservedRunningTime="2026-03-01 09:57:48.76406713 +0000 UTC m=+2998.005946327" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.146692 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.149551 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154617 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154766 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.154960 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.155033 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.237858 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " 
pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.340111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.359047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"auto-csr-approver-29539318-d9tmc\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.470289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:00 crc kubenswrapper[4792]: I0301 09:58:00.922886 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 09:58:01 crc kubenswrapper[4792]: I0301 09:58:01.846547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerStarted","Data":"9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9"} Mar 01 09:58:02 crc kubenswrapper[4792]: I0301 09:58:02.855953 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerID="7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6" exitCode=0 Mar 01 09:58:02 crc kubenswrapper[4792]: I0301 09:58:02.855999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerDied","Data":"7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6"} Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.247767 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.314040 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") pod \"bf3aad6a-fbd9-4a24-a489-33507709811b\" (UID: \"bf3aad6a-fbd9-4a24-a489-33507709811b\") " Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.330295 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q" (OuterVolumeSpecName: "kube-api-access-9sv8q") pod "bf3aad6a-fbd9-4a24-a489-33507709811b" (UID: "bf3aad6a-fbd9-4a24-a489-33507709811b"). InnerVolumeSpecName "kube-api-access-9sv8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.416881 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sv8q\" (UniqueName: \"kubernetes.io/projected/bf3aad6a-fbd9-4a24-a489-33507709811b-kube-api-access-9sv8q\") on node \"crc\" DevicePath \"\"" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" event={"ID":"bf3aad6a-fbd9-4a24-a489-33507709811b","Type":"ContainerDied","Data":"9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9"} Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871274 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e03b1c59248ba09c1b8c07faed32e4559aa4ccbf524fc20f2afd8ba80f48cc9" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.871252 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539318-d9tmc" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944366 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944418 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.944464 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.945139 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 09:58:04 crc kubenswrapper[4792]: I0301 09:58:04.945189 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" gracePeriod=600 Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.308881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.316717 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539312-tpqkw"] Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.417892 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11abc020-6c8a-4de3-8afc-229196293ab0" path="/var/lib/kubelet/pods/11abc020-6c8a-4de3-8afc-229196293ab0/volumes" Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880583 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" exitCode=0 Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1"} Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880968 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} Mar 01 09:58:05 crc kubenswrapper[4792]: I0301 09:58:05.880995 4792 scope.go:117] "RemoveContainer" containerID="f0bf547b226581a5fcf3e629c0669f542e2db0499818e26888166a11594cf983" Mar 01 09:58:31 crc kubenswrapper[4792]: I0301 09:58:31.704816 4792 scope.go:117] "RemoveContainer" containerID="89836f5e70f069d0d23a66a3f24b77f6002210b440a744a89543043e75793243" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.148718 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:00 crc kubenswrapper[4792]: E0301 10:00:00.149790 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.149809 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.150261 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" containerName="oc" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.151330 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.153829 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.154425 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.162015 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.163147 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.167801 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.167955 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.168085 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.170473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.177422 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.305953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306206 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.306226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 
10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.408572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.410049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.414038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"collect-profiles-29539320-dbp4j\" (UID: 
\"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.428669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"auto-csr-approver-29539320-bzqcr\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.431700 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"collect-profiles-29539320-dbp4j\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.522241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.532706 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:00 crc kubenswrapper[4792]: I0301 10:00:00.983197 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j"] Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.057566 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.809621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerStarted","Data":"6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1"} Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811954 4792 generic.go:334] "Generic (PLEG): container finished" podID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerID="f2d004ec429ab6ffe82899f882bc50441e9749ff1c56e27e78714fcc65b133e4" exitCode=0 Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811979 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerDied","Data":"f2d004ec429ab6ffe82899f882bc50441e9749ff1c56e27e78714fcc65b133e4"} Mar 01 10:00:01 crc kubenswrapper[4792]: I0301 10:00:01.811992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerStarted","Data":"81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b"} Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.131476 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170503 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.170673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") pod \"8730f988-7504-4e33-a0dc-406e4b21ca50\" (UID: \"8730f988-7504-4e33-a0dc-406e4b21ca50\") " Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.171478 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume" (OuterVolumeSpecName: "config-volume") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.179361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt" (OuterVolumeSpecName: "kube-api-access-84gpt") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). 
InnerVolumeSpecName "kube-api-access-84gpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.181034 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8730f988-7504-4e33-a0dc-406e4b21ca50" (UID: "8730f988-7504-4e33-a0dc-406e4b21ca50"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272557 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84gpt\" (UniqueName: \"kubernetes.io/projected/8730f988-7504-4e33-a0dc-406e4b21ca50-kube-api-access-84gpt\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272584 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8730f988-7504-4e33-a0dc-406e4b21ca50-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.272595 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8730f988-7504-4e33-a0dc-406e4b21ca50-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" event={"ID":"8730f988-7504-4e33-a0dc-406e4b21ca50","Type":"ContainerDied","Data":"81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b"} Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830420 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e84ed6319131ce2cccc0987ee94a63105b632fd5d0ca856c2abe3c989cb70b" Mar 01 10:00:03 crc kubenswrapper[4792]: I0301 10:00:03.830436 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539320-dbp4j" Mar 01 10:00:04 crc kubenswrapper[4792]: I0301 10:00:04.214594 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 10:00:04 crc kubenswrapper[4792]: I0301 10:00:04.224116 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539275-5mhm4"] Mar 01 10:00:05 crc kubenswrapper[4792]: I0301 10:00:05.422682 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8558286c-6cb2-4061-bb84-07803d33b576" path="/var/lib/kubelet/pods/8558286c-6cb2-4061-bb84-07803d33b576/volumes" Mar 01 10:00:18 crc kubenswrapper[4792]: I0301 10:00:18.959447 4792 generic.go:334] "Generic (PLEG): container finished" podID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerID="4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9" exitCode=0 Mar 01 10:00:18 crc kubenswrapper[4792]: I0301 10:00:18.959560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerDied","Data":"4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9"} Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.266830 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.374143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") pod \"a5c085ec-b23e-4ad9-ae76-9775921b667d\" (UID: \"a5c085ec-b23e-4ad9-ae76-9775921b667d\") " Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.391097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf" (OuterVolumeSpecName: "kube-api-access-gcgkf") pod "a5c085ec-b23e-4ad9-ae76-9775921b667d" (UID: "a5c085ec-b23e-4ad9-ae76-9775921b667d"). InnerVolumeSpecName "kube-api-access-gcgkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.476675 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcgkf\" (UniqueName: \"kubernetes.io/projected/a5c085ec-b23e-4ad9-ae76-9775921b667d-kube-api-access-gcgkf\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" event={"ID":"a5c085ec-b23e-4ad9-ae76-9775921b667d","Type":"ContainerDied","Data":"6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1"} Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979460 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1" Mar 01 10:00:20 crc kubenswrapper[4792]: I0301 10:00:20.979533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539320-bzqcr" Mar 01 10:00:21 crc kubenswrapper[4792]: E0301 10:00:21.127868 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c085ec_b23e_4ad9_ae76_9775921b667d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c085ec_b23e_4ad9_ae76_9775921b667d.slice/crio-6950cd079a11f0e71e0a4f9c432b59a859efd91bdf75b8459961647862c328b1\": RecentStats: unable to find data in memory cache]" Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.320824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.330347 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539314-9brl7"] Mar 01 10:00:21 crc kubenswrapper[4792]: I0301 10:00:21.426631 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d633fb4c-b1e3-463f-af0a-2891b7130fc0" path="/var/lib/kubelet/pods/d633fb4c-b1e3-463f-af0a-2891b7130fc0/volumes" Mar 01 10:00:31 crc kubenswrapper[4792]: I0301 10:00:31.797495 4792 scope.go:117] "RemoveContainer" containerID="25d35f30a0bda8efbd3c0227d7f47d3a25118a249c78dad70cafb27c52068acf" Mar 01 10:00:31 crc kubenswrapper[4792]: I0301 10:00:31.822266 4792 scope.go:117] "RemoveContainer" containerID="19ca062443b337d8791859ab02de766e48126cb99d1f720dcaa520cb4be8f904" Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.090633 4792 generic.go:334] "Generic (PLEG): container finished" podID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerID="136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979" exitCode=0 Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.090677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerDied","Data":"136627f27d1451f409d8606c1096deb71cb0c8c3aa23573781151b161026a979"} Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.942634 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:00:34 crc kubenswrapper[4792]: I0301 10:00:34.942963 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.467865 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645488 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645555 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645685 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645875 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.645948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.646066 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.646113 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") pod \"d7776778-c586-4ab6-8fdf-bfed4168992d\" (UID: \"d7776778-c586-4ab6-8fdf-bfed4168992d\") " Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.660341 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.660398 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph" (OuterVolumeSpecName: "ceph") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.667721 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5" (OuterVolumeSpecName: "kube-api-access-qkzx5") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "kube-api-access-qkzx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.674416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.674864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.686282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.690773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.695087 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory" (OuterVolumeSpecName: "inventory") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.695130 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.701877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.703619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.705643 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.714065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d7776778-c586-4ab6-8fdf-bfed4168992d" (UID: "d7776778-c586-4ab6-8fdf-bfed4168992d"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750945 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzx5\" (UniqueName: \"kubernetes.io/projected/d7776778-c586-4ab6-8fdf-bfed4168992d-kube-api-access-qkzx5\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750972 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750984 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.750996 4792 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751005 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-inventory\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751014 4792 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751024 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" 
Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751032 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751041 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751049 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751057 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751066 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d7776778-c586-4ab6-8fdf-bfed4168992d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:35 crc kubenswrapper[4792]: I0301 10:00:35.751074 4792 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/d7776778-c586-4ab6-8fdf-bfed4168992d-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109041 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" 
event={"ID":"d7776778-c586-4ab6-8fdf-bfed4168992d","Type":"ContainerDied","Data":"f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9"} Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109323 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6255f4f054d2fb3983c00b8a4caf1954fd5251d362befc32634842b245118e9" Mar 01 10:00:36 crc kubenswrapper[4792]: I0301 10:00:36.109104 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.242682 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243560 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243571 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles" Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243591 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243599 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 01 10:00:50 crc kubenswrapper[4792]: E0301 10:00:50.243614 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243621 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc" Mar 01 10:00:50 
crc kubenswrapper[4792]: I0301 10:00:50.243805 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" containerName="oc" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243832 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8730f988-7504-4e33-a0dc-406e4b21ca50" containerName="collect-profiles" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.243848 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7776778-c586-4ab6-8fdf-bfed4168992d" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.244959 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.249478 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.249762 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.253741 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.258697 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.261597 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.281298 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.288065 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373922 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373962 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.373994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374655 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374865 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod 
\"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374970 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.374994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375025 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " 
pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375066 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: 
I0301 10:00:50.375235 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.375409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477245 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477343 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477408 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: 
\"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477608 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477632 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477657 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477725 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477757 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-run\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: 
\"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477800 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477839 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477916 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477933 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.477962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478065 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod 
\"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478124 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478446 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478463 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478491 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478647 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-dev\") pod \"cinder-volume-volume1-0\" (UID: 
\"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-run\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-lib-modules\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478337 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479330 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-dev\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.478611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-sys\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.479587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23d15722-3d0f-44ce-ac55-eba67760f0e9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc 
kubenswrapper[4792]: I0301 10:00:50.492130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-scripts\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.493228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.498814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.498855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.499341 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.499415 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-ceph\") 
pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.500456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-ceph\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.500919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d15722-3d0f-44ce-ac55-eba67760f0e9-config-data\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.502201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.503054 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.507346 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmq67\" (UniqueName: \"kubernetes.io/projected/23d15722-3d0f-44ce-ac55-eba67760f0e9-kube-api-access-qmq67\") pod \"cinder-backup-0\" (UID: \"23d15722-3d0f-44ce-ac55-eba67760f0e9\") " pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.511690 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtmd\" (UniqueName: \"kubernetes.io/projected/d3ca4743-fa6c-4e2e-b2c8-b2362f44a727-kube-api-access-rhtmd\") pod \"cinder-volume-volume1-0\" (UID: \"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727\") " pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.566676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.578490 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.942881 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.945598 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:50 crc kubenswrapper[4792]: I0301 10:00:50.957595 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.047776 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.048996 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.063299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.078674 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.089048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.089118 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.094129 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.095623 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.102595 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c29rk" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.102702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.103030 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.103186 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.151004 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.152576 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157365 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157740 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.157942 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.159115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.167008 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.186557 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.190952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191031 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191076 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191101 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62fr5\" (UniqueName: 
\"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191172 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.191252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.196157 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.214240 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.215939 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.222812 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.223057 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.240585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.248339 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.249231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"manila-db-create-svb22\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.279697 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.293943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294017 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294132 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: 
\"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294221 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294243 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod 
\"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294355 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294441 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " 
pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.294515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.301028 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: E0301 10:00:51.302285 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-lxswf logs scripts], unattached volumes=[], failed to process volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-lxswf logs scripts]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="6c25331e-14fb-47d2-aa34-d84b133255e3" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.302999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.303432 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"horizon-5887d74897-rnlz9\" (UID: 
\"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.307263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.316106 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.332477 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"manila-fa75-account-create-update-s2kng\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.348742 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: E0301 10:00:51.349445 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-pbxxb logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="316eb94f-166a-4fa2-99b8-8967503eba43" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.365376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pf4f\" (UniqueName: 
\"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"horizon-5887d74897-rnlz9\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.371593 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.373996 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399542 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399675 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399804 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399842 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.399965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400083 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.400215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410215 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410280 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.410866 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.415562 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.444712 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.444991 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.445233 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.445401 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c29rk" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.453164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.457184 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.463076 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.463568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.472794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.477934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.477970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc 
kubenswrapper[4792]: I0301 10:00:51.501825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.501870 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.501973 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502003 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502313 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502379 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502437 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502488 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502512 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.502529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503128 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.503766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.504846 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.507709 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.521693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.522421 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.531177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " 
pod="openstack/glance-default-external-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.535419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.543122 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.543798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.549686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.558171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614222 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.614462 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.617747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.618394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.619139 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.624924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.641381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.648139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"horizon-5996ddfbb9-drpwn\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:51 crc kubenswrapper[4792]: I0301 10:00:51.724401 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.080806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.179543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:00:52 crc kubenswrapper[4792]: W0301 10:00:52.246301 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7736148_bc12_4621_a1d2_efc4a0143b42.slice/crio-0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c WatchSource:0}: Error finding container 0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c: Status 404 returned error can't find the container with id 0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.284787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"7d8b2f2c199c4c39a20036ac6f44cb2e87ed37faba8d8865f674bc98eb31c240"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.288872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerStarted","Data":"0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298161 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"097dfe01451643cae3dafae5e153076ff26e6d6a3d8864e96d7db2cb52ad425a"} Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.298862 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.313583 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.334733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.347083 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.370148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.391164 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451574 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451755 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451808 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") pod \"6c25331e-14fb-47d2-aa34-d84b133255e3\" (UID: \"6c25331e-14fb-47d2-aa34-d84b133255e3\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.451879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") pod \"316eb94f-166a-4fa2-99b8-8967503eba43\" (UID: \"316eb94f-166a-4fa2-99b8-8967503eba43\") " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.452617 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: 
"316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.453247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs" (OuterVolumeSpecName: "logs") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.456294 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475136 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph" (OuterVolumeSpecName: "ceph") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475226 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf" (OuterVolumeSpecName: "kube-api-access-lxswf") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "kube-api-access-lxswf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.475275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph" (OuterVolumeSpecName: "ceph") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.488204 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.491247 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.492028 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs" (OuterVolumeSpecName: "logs") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.509037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512051 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512073 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data" (OuterVolumeSpecName: "config-data") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512116 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts" (OuterVolumeSpecName: "scripts") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data" (OuterVolumeSpecName: "config-data") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.512159 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c25331e-14fb-47d2-aa34-d84b133255e3" (UID: "6c25331e-14fb-47d2-aa34-d84b133255e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts" (OuterVolumeSpecName: "scripts") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb" (OuterVolumeSpecName: "kube-api-access-pbxxb") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "kube-api-access-pbxxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.515414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316eb94f-166a-4fa2-99b8-8967503eba43" (UID: "316eb94f-166a-4fa2-99b8-8967503eba43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554209 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554243 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxxb\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-kube-api-access-pbxxb\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554256 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554265 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554274 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554283 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554293 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/316eb94f-166a-4fa2-99b8-8967503eba43-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554301 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554322 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554336 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554346 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxswf\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-kube-api-access-lxswf\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554354 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554362 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eb94f-166a-4fa2-99b8-8967503eba43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 
10:00:52.554371 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c25331e-14fb-47d2-aa34-d84b133255e3-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554380 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316eb94f-166a-4fa2-99b8-8967503eba43-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554388 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554396 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c25331e-14fb-47d2-aa34-d84b133255e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.554404 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c25331e-14fb-47d2-aa34-d84b133255e3-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.560248 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.589144 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.628044 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.656061 4792 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:52 crc kubenswrapper[4792]: I0301 10:00:52.656095 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.317214 4792 generic.go:334] "Generic (PLEG): container finished" podID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerID="bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661" exitCode=0 Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.317664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerDied","Data":"bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.320933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"f6db483fc0d11feaf9f5fa34256dfeb85c12b4ea3626fb137a55f680d9cb8957"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322885 4792 generic.go:334] "Generic (PLEG): container finished" podID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerID="31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533" exitCode=0 Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322958 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerDied","Data":"31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.322986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" 
event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerStarted","Data":"f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.324447 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.325877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"cdd7f60094a6b70916ce6fb3a1317938714cd27149f604e9df29f382e04185b2"} Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.325994 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.503929 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.504182 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.504198 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.505667 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.514759 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.514792 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.515010 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.515123 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwjrh" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 
10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593457 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593800 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 
10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.593874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.616995 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.641184 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.654514 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.663425 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.665609 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.674861 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.676858 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.677054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695349 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695623 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.695802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696949 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697251 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697378 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697555 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697214 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.696894 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.697147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.724368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.749868 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.750108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc 
kubenswrapper[4792]: I0301 10:00:53.750590 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.749688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.774215 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.780684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.807495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.807697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810345 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810776 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.810805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.811600 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.813136 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.813333 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"52d866c4-a856-4930-9501-2be56e07d3ce\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.827127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.836821 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.841701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.844819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.876033 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.876927 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.880210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.943406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.944975 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.949815 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.952664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.966237 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:53 crc kubenswrapper[4792]: I0301 10:00:53.990373 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.010743 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025082 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025131 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.025341 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.039476 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.066144 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.080361 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.098032 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.107803 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.115521 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130074 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130143 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130173 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130194 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbj7z\" (UniqueName: 
\"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130317 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130338 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.130477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.132993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod 
\"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.133284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.133571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.141832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.146004 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.151653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc 
kubenswrapper[4792]: I0301 10:00:54.168596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"horizon-689c76c966-7mbkl\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240364 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240920 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbj7z\" (UniqueName: \"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.240949 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.241011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.241078 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-scripts\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f79f77-ac1b-445e-8e28-85c8964f5461-logs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.243727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f79f77-ac1b-445e-8e28-85c8964f5461-config-data\") pod 
\"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.246386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-tls-certs\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.264494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-combined-ca-bundle\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.269789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7f79f77-ac1b-445e-8e28-85c8964f5461-horizon-secret-key\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.280629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbj7z\" (UniqueName: \"kubernetes.io/projected/d7f79f77-ac1b-445e-8e28-85c8964f5461-kube-api-access-dbj7z\") pod \"horizon-79f8cb6d9d-xg7h5\" (UID: \"d7f79f77-ac1b-445e-8e28-85c8964f5461\") " pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.320149 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.379714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"0fe3e9d9e5029b744e1fad7c8697fe586697f00dbac1e16da5e28bbf1f726170"} Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.379763 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d3ca4743-fa6c-4e2e-b2c8-b2362f44a727","Type":"ContainerStarted","Data":"e4d4459208928b8cecfd99dceb97323b4f6b790a52fc4ce6cf99fd4332f79534"} Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.460764 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.787862 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.172949351 podStartE2EDuration="4.78784535s" podCreationTimestamp="2026-03-01 10:00:50 +0000 UTC" firstStartedPulling="2026-03-01 10:00:51.546478753 +0000 UTC m=+3180.788357950" lastFinishedPulling="2026-03-01 10:00:53.161374752 +0000 UTC m=+3182.403253949" observedRunningTime="2026-03-01 10:00:54.407357807 +0000 UTC m=+3183.649237004" watchObservedRunningTime="2026-03-01 10:00:54.78784535 +0000 UTC m=+3184.029724547" Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.791521 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:54 crc kubenswrapper[4792]: W0301 10:00:54.848510 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6327707a_a9c5_4ba1_9c54_21cbb2e47222.slice/crio-4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86 WatchSource:0}: Error finding 
container 4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86: Status 404 returned error can't find the container with id 4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86 Mar 01 10:00:54 crc kubenswrapper[4792]: I0301 10:00:54.878156 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.076550 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:00:55 crc kubenswrapper[4792]: W0301 10:00:55.116893 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0 WatchSource:0}: Error finding container ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0: Status 404 returned error can't find the container with id ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0 Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.207573 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:55 crc kubenswrapper[4792]: W0301 10:00:55.231446 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f79f77_ac1b_445e_8e28_85c8964f5461.slice/crio-ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707 WatchSource:0}: Error finding container ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707: Status 404 returned error can't find the container with id ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707 Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.238819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f8cb6d9d-xg7h5"] Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.257396 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.291712 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") pod \"c7736148-bc12-4621-a1d2-efc4a0143b42\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.291801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") pod \"c7736148-bc12-4621-a1d2-efc4a0143b42\" (UID: \"c7736148-bc12-4621-a1d2-efc4a0143b42\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.301523 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7736148-bc12-4621-a1d2-efc4a0143b42" 
(UID: "c7736148-bc12-4621-a1d2-efc4a0143b42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.317243 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5" (OuterVolumeSpecName: "kube-api-access-62fr5") pod "c7736148-bc12-4621-a1d2-efc4a0143b42" (UID: "c7736148-bc12-4621-a1d2-efc4a0143b42"). InnerVolumeSpecName "kube-api-access-62fr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.401532 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") pod \"0441b486-847a-4f32-8df2-a1284f39ee5d\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.401580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") pod \"0441b486-847a-4f32-8df2-a1284f39ee5d\" (UID: \"0441b486-847a-4f32-8df2-a1284f39ee5d\") " Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.402185 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62fr5\" (UniqueName: \"kubernetes.io/projected/c7736148-bc12-4621-a1d2-efc4a0143b42-kube-api-access-62fr5\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.402200 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7736148-bc12-4621-a1d2-efc4a0143b42-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.403560 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0441b486-847a-4f32-8df2-a1284f39ee5d" (UID: "0441b486-847a-4f32-8df2-a1284f39ee5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.409525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q" (OuterVolumeSpecName: "kube-api-access-jbq5q") pod "0441b486-847a-4f32-8df2-a1284f39ee5d" (UID: "0441b486-847a-4f32-8df2-a1284f39ee5d"). InnerVolumeSpecName "kube-api-access-jbq5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.444282 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316eb94f-166a-4fa2-99b8-8967503eba43" path="/var/lib/kubelet/pods/316eb94f-166a-4fa2-99b8-8967503eba43/volumes" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.445143 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c25331e-14fb-47d2-aa34-d84b133255e3" path="/var/lib/kubelet/pods/6c25331e-14fb-47d2-aa34-d84b133255e3/volumes" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.445583 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-fa75-account-create-update-s2kng" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-fa75-account-create-update-s2kng" event={"ID":"0441b486-847a-4f32-8df2-a1284f39ee5d","Type":"ContainerDied","Data":"f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.449509 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f184175f9208322fa996d81dbf3433551442dc12615f186d77ca1d9635f54883" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.479077 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.504982 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbq5q\" (UniqueName: \"kubernetes.io/projected/0441b486-847a-4f32-8df2-a1284f39ee5d-kube-api-access-jbq5q\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.505007 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0441b486-847a-4f32-8df2-a1284f39ee5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.513656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" 
event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"ac108770bf85ca6b5834e2dc59696facad649f58525fe3c2fe62f7a6209a3707"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.528162 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"73a05d008372db4bf11e47f503cd30ac75f01c346a6a7e72a1f353158728cf02"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.539236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"7964dba61557d3f331243b26dd157e553388e41ae7845157ebedbb8b686a988f"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.539283 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"23d15722-3d0f-44ce-ac55-eba67760f0e9","Type":"ContainerStarted","Data":"df777540105399ac6291b9dde10bc4d5ed076c6cb9b38e40403b88e9a6e303cc"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554276 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-svb22" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-svb22" event={"ID":"c7736148-bc12-4621-a1d2-efc4a0143b42","Type":"ContainerDied","Data":"0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c"} Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.554340 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f8fca8a39e93a444c277a56b4d25c74d8a2b8d4dd4ac5a12e9ce102355a7e5c" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.567108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.567418 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.775778644 podStartE2EDuration="5.567409919s" podCreationTimestamp="2026-03-01 10:00:50 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.085099547 +0000 UTC m=+3181.326978734" lastFinishedPulling="2026-03-01 10:00:53.876730812 +0000 UTC m=+3183.118610009" observedRunningTime="2026-03-01 10:00:55.565505722 +0000 UTC m=+3184.807384919" watchObservedRunningTime="2026-03-01 10:00:55.567409919 +0000 UTC m=+3184.809289116" Mar 01 10:00:55 crc kubenswrapper[4792]: I0301 10:00:55.579746 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 01 10:00:56 crc kubenswrapper[4792]: I0301 10:00:56.639357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda"} Mar 01 10:00:56 crc kubenswrapper[4792]: I0301 10:00:56.665382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerStarted","Data":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675848 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" containerID="cri-o://deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.675942 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" containerID="cri-o://e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.681976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerStarted","Data":"14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838"} Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.682100 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" containerID="cri-o://da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.682207 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" containerID="cri-o://14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" gracePeriod=30 Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.707236 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.707216462 podStartE2EDuration="4.707216462s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:00:57.697550391 +0000 UTC m=+3186.939429588" watchObservedRunningTime="2026-03-01 10:00:57.707216462 +0000 UTC m=+3186.949095659" Mar 01 10:00:57 crc kubenswrapper[4792]: I0301 10:00:57.739113 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.739096807 podStartE2EDuration="4.739096807s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:00:57.738785399 +0000 UTC m=+3186.980664596" watchObservedRunningTime="2026-03-01 10:00:57.739096807 +0000 UTC m=+3186.980976004" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.501037 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614614 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614683 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-866lt\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.614898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615009 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615038 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615134 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.615490 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") pod \"52d866c4-a856-4930-9501-2be56e07d3ce\" (UID: \"52d866c4-a856-4930-9501-2be56e07d3ce\") " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.618238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.618823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs" (OuterVolumeSpecName: "logs") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.627651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt" (OuterVolumeSpecName: "kube-api-access-866lt") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "kube-api-access-866lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.628065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts" (OuterVolumeSpecName: "scripts") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.629021 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.642988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph" (OuterVolumeSpecName: "ceph") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.660608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.720881 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721203 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721214 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721236 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721245 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52d866c4-a856-4930-9501-2be56e07d3ce-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721254 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-866lt\" (UniqueName: 
\"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-kube-api-access-866lt\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.721264 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52d866c4-a856-4930-9501-2be56e07d3ce-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.735148 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data" (OuterVolumeSpecName: "config-data") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736083 4792 generic.go:334] "Generic (PLEG): container finished" podID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerID="14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736394 4792 generic.go:334] "Generic (PLEG): container finished" podID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerID="da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.736658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.737249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 
10:00:58.749238 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.760970 4792 generic.go:334] "Generic (PLEG): container finished" podID="52d866c4-a856-4930-9501-2be56e07d3ce" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761005 4792 generic.go:334] "Generic (PLEG): container finished" podID="52d866c4-a856-4930-9501-2be56e07d3ce" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" exitCode=143 Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761056 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761065 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52d866c4-a856-4930-9501-2be56e07d3ce","Type":"ContainerDied","Data":"73a05d008372db4bf11e47f503cd30ac75f01c346a6a7e72a1f353158728cf02"} Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761089 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.761475 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.819036 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.829522 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.829540 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.830297 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52d866c4-a856-4930-9501-2be56e07d3ce" (UID: "52d866c4-a856-4930-9501-2be56e07d3ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.934808 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d866c4-a856-4930-9501-2be56e07d3ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.957259 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: E0301 10:00:58.962268 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962312 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} err="failed to get container status \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962338 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: E0301 10:00:58.962875 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": 
container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962929 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} err="failed to get container status \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.962944 4792 scope.go:117] "RemoveContainer" containerID="e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.963983 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb"} err="failed to get container status \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": rpc error: code = NotFound desc = could not find container \"e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb\": container with ID starting with e4ccbed7b23e1a31ea640c9ac93a94d665b7bb34ad516870757d1a5f808133fb not found: ID does not exist" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.964029 4792 scope.go:117] "RemoveContainer" containerID="deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5" Mar 01 10:00:58 crc kubenswrapper[4792]: I0301 10:00:58.964456 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5"} err="failed to get 
container status \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": rpc error: code = NotFound desc = could not find container \"deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5\": container with ID starting with deba6fff610fdacbe233e3a2281a121eaefcf69a6c1a39995332da9dc9830ca5 not found: ID does not exist" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.110300 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.133824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.143994 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162097 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162685 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162709 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162730 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162737 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162788 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162796 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162810 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162818 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: E0301 10:00:59.162831 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.162839 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163072 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163121 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" containerName="mariadb-database-create" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163137 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52d866c4-a856-4930-9501-2be56e07d3ce" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163152 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-log" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163162 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" containerName="glance-httpd" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.163176 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" containerName="mariadb-account-create-update" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.164244 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.168055 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.168651 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.189217 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248612 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") pod 
\"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248648 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hljh6\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248780 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.248990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249012 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") pod \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\" (UID: \"6327707a-a9c5-4ba1-9c54-21cbb2e47222\") " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249585 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.249725 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.252661 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.253619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs" (OuterVolumeSpecName: "logs") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.256692 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6" (OuterVolumeSpecName: "kube-api-access-hljh6") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "kube-api-access-hljh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.258290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.262384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph" (OuterVolumeSpecName: "ceph") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.274391 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts" (OuterVolumeSpecName: "scripts") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.285473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.342767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data" (OuterVolumeSpecName: "config-data") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.348178 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6327707a-a9c5-4ba1-9c54-21cbb2e47222" (UID: "6327707a-a9c5-4ba1-9c54-21cbb2e47222"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.350940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351024 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351088 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " 
pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351109 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351193 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 
10:00:59.351314 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351426 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351438 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351447 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351455 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351463 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6327707a-a9c5-4ba1-9c54-21cbb2e47222-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351481 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351492 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hljh6\" (UniqueName: 
\"kubernetes.io/projected/6327707a-a9c5-4ba1-9c54-21cbb2e47222-kube-api-access-hljh6\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351500 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6327707a-a9c5-4ba1-9c54-21cbb2e47222-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.351527 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-logs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.352640 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.353856 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52b189da-3327-40c1-bf22-a842b0980593-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.354627 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-scripts\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.356040 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-config-data\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.358070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-ceph\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.366004 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.370892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnt28\" (UniqueName: \"kubernetes.io/projected/52b189da-3327-40c1-bf22-a842b0980593-kube-api-access-jnt28\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.371063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b189da-3327-40c1-bf22-a842b0980593-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.394343 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.395431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"52b189da-3327-40c1-bf22-a842b0980593\") " pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.425337 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d866c4-a856-4930-9501-2be56e07d3ce" path="/var/lib/kubelet/pods/52d866c4-a856-4930-9501-2be56e07d3ce/volumes" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.452825 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.489560 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780155 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6327707a-a9c5-4ba1-9c54-21cbb2e47222","Type":"ContainerDied","Data":"4dd7bd8f9c142eedf5828dbb684b632a29d244482b177fd704a0a7b1d31e3c86"} Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780489 4792 scope.go:117] "RemoveContainer" containerID="14967d4571fbfd80a32479d19d2b933beb19ce543caa25b7ae07a663d46e8838" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.780192 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.815016 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.829460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.852043 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.853638 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.856684 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.857045 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.897843 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.966998 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967485 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zm2\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967561 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967901 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:00:59 crc kubenswrapper[4792]: I0301 10:00:59.967955 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.042572 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.070907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072201 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zm2\" (UniqueName: 
\"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.072323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-logs\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071356 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.071200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9d055103-6c35-481f-820a-7aa363543404-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.077125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.079400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.081360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.088758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zm2\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-kube-api-access-b6zm2\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.088841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9d055103-6c35-481f-820a-7aa363543404-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.097371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d055103-6c35-481f-820a-7aa363543404-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " 
pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.133609 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"9d055103-6c35-481f-820a-7aa363543404\") " pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.144807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.146353 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.157613 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.202830 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278277 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.278420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379716 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod 
\"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379803 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379822 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.379857 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.383781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.416292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " 
pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.416352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.419142 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"keystone-cron-29539321-sclgm\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.539241 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.769960 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 01 10:01:00 crc kubenswrapper[4792]: I0301 10:01:00.873563 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.432620 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6327707a-a9c5-4ba1-9c54-21cbb2e47222" path="/var/lib/kubelet/pods/6327707a-a9c5-4ba1-9c54-21cbb2e47222/volumes" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.491566 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.493074 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.504097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.504789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-x2pjp" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.527867 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.617809 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: 
\"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719678 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.719768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.730478 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"manila-db-sync-p2dtn\" (UID: 
\"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.739365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.740112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.745047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"manila-db-sync-p2dtn\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:01 crc kubenswrapper[4792]: I0301 10:01:01.822029 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:04 crc kubenswrapper[4792]: I0301 10:01:04.943281 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:01:04 crc kubenswrapper[4792]: I0301 10:01:04.943664 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:01:05 crc kubenswrapper[4792]: W0301 10:01:05.666576 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52b189da_3327_40c1_bf22_a842b0980593.slice/crio-3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537 WatchSource:0}: Error finding container 3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537: Status 404 returned error can't find the container with id 3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537 Mar 01 10:01:05 crc kubenswrapper[4792]: I0301 10:01:05.708767 4792 scope.go:117] "RemoveContainer" containerID="da274f80edf717de1da1c26bd84e489a22c77768488cdf1c9349a7c0e2449fda" Mar 01 10:01:05 crc kubenswrapper[4792]: I0301 10:01:05.884341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"3a9834ee2fc0174c61ddd45d2bc8e20177af7786aa6702cb7d4060cedc520537"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.318942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.437395 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29539321-sclgm"] Mar 01 10:01:06 crc kubenswrapper[4792]: W0301 10:01:06.515216 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ec04609_b280_4df0_a0c5_2e4c7208c1c6.slice/crio-9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae WatchSource:0}: Error finding container 9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae: Status 404 returned error can't find the container with id 9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.532078 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:01:06 crc kubenswrapper[4792]: W0301 10:01:06.570935 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01bf5dae_6217_4644_9c9b_65d3886a4dc1.slice/crio-602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632 WatchSource:0}: Error finding container 602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632: Status 404 returned error can't find the container with id 602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632 Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.589688 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.901889 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5996ddfbb9-drpwn" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log" containerID="cri-o://b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" gracePeriod=30 Mar 01 10:01:06 crc 
kubenswrapper[4792]: I0301 10:01:06.901883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.902398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerStarted","Data":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.902521 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5996ddfbb9-drpwn" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon" containerID="cri-o://54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" gracePeriod=30 Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.907119 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"e344c821e4bd4422c2a14dbebedafc5386ba96ddbd63c7249bebc2482072eb00"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.909611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"10cf97f98099f7739343179f7e9a4039e3126ba719c407bf92cd7c8fe310ddc5"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.911609 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.918182 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerStarted","Data":"602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.926053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.926132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerStarted","Data":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.933870 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5996ddfbb9-drpwn" podStartSLOduration=2.582071496 podStartE2EDuration="15.933847055s" podCreationTimestamp="2026-03-01 10:00:51 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.598505993 +0000 UTC m=+3181.840385190" lastFinishedPulling="2026-03-01 10:01:05.950281552 +0000 UTC m=+3195.192160749" observedRunningTime="2026-03-01 10:01:06.923572589 +0000 UTC m=+3196.165451786" watchObservedRunningTime="2026-03-01 10:01:06.933847055 +0000 UTC m=+3196.175726252" Mar 01 10:01:06 crc kubenswrapper[4792]: I0301 10:01:06.935985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerStarted","Data":"9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.961756 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" 
event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerStarted","Data":"45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.965613 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f8cb6d9d-xg7h5" event={"ID":"d7f79f77-ac1b-445e-8e28-85c8964f5461","Type":"ContainerStarted","Data":"0cf91c9a522aacb2134a6cd7d91d0553f3efca2cb569fa0827511bb944ef8438"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.975523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"7d2cc88f8a6f37e266989b6de9d788f976fe55173800f18c1f69b4a860e6c34f"} Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.985572 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-689c76c966-7mbkl" podStartSLOduration=4.156399664 podStartE2EDuration="14.985557338s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="2026-03-01 10:00:55.129533447 +0000 UTC m=+3184.371412644" lastFinishedPulling="2026-03-01 10:01:05.958691131 +0000 UTC m=+3195.200570318" observedRunningTime="2026-03-01 10:01:06.954230473 +0000 UTC m=+3196.196109670" watchObservedRunningTime="2026-03-01 10:01:07.985557338 +0000 UTC m=+3197.227436535" Mar 01 10:01:07 crc kubenswrapper[4792]: I0301 10:01:07.986768 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29539321-sclgm" podStartSLOduration=7.986764488 podStartE2EDuration="7.986764488s" podCreationTimestamp="2026-03-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:07.982891522 +0000 UTC m=+3197.224770719" watchObservedRunningTime="2026-03-01 10:01:07.986764488 +0000 UTC m=+3197.228643675" Mar 01 10:01:08 crc 
kubenswrapper[4792]: I0301 10:01:08.004887 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79f8cb6d9d-xg7h5" podStartSLOduration=4.301035419 podStartE2EDuration="15.00487385s" podCreationTimestamp="2026-03-01 10:00:53 +0000 UTC" firstStartedPulling="2026-03-01 10:00:55.259448145 +0000 UTC m=+3184.501327342" lastFinishedPulling="2026-03-01 10:01:05.963286586 +0000 UTC m=+3195.205165773" observedRunningTime="2026-03-01 10:01:08.002954642 +0000 UTC m=+3197.244833849" watchObservedRunningTime="2026-03-01 10:01:08.00487385 +0000 UTC m=+3197.246753047" Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.006775 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"ff29ffdd6a80c6f635ebf305dac713755a6834f40a80f0b4fc3312233561d5b1"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.006827 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52b189da-3327-40c1-bf22-a842b0980593","Type":"ContainerStarted","Data":"bcb072fe43d6152d4fc47f391c97a3f0de0392a05202abc73c45febc53acaf52"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021744 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5887d74897-rnlz9" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log" containerID="cri-o://b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" gracePeriod=30 Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerStarted","Data":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.021932 4792 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-5887d74897-rnlz9" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon" containerID="cri-o://4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" gracePeriod=30 Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.057184 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.057160293 podStartE2EDuration="9.057160293s" podCreationTimestamp="2026-03-01 10:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:08.043714238 +0000 UTC m=+3197.285593445" watchObservedRunningTime="2026-03-01 10:01:08.057160293 +0000 UTC m=+3197.299039490" Mar 01 10:01:08 crc kubenswrapper[4792]: I0301 10:01:08.085939 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5887d74897-rnlz9" podStartSLOduration=3.596435919 podStartE2EDuration="17.08592169s" podCreationTimestamp="2026-03-01 10:00:51 +0000 UTC" firstStartedPulling="2026-03-01 10:00:52.375235238 +0000 UTC m=+3181.617114435" lastFinishedPulling="2026-03-01 10:01:05.864721009 +0000 UTC m=+3195.106600206" observedRunningTime="2026-03-01 10:01:08.072568807 +0000 UTC m=+3197.314448004" watchObservedRunningTime="2026-03-01 10:01:08.08592169 +0000 UTC m=+3197.327800887" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.035311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9d055103-6c35-481f-820a-7aa363543404","Type":"ContainerStarted","Data":"95765fba596a2f1562b23cd8ba4b618f6eae542fa1c12361f390021023e3a630"} Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.065447 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.065424023 
podStartE2EDuration="10.065424023s" podCreationTimestamp="2026-03-01 10:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:09.054152682 +0000 UTC m=+3198.296031879" watchObservedRunningTime="2026-03-01 10:01:09.065424023 +0000 UTC m=+3198.307303220" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.491355 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.491406 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.537441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 01 10:01:09 crc kubenswrapper[4792]: I0301 10:01:09.559736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.041332 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.042739 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.203896 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.203966 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.243246 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 01 10:01:10 crc kubenswrapper[4792]: I0301 10:01:10.285068 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.049232 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.049287 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.458460 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:01:11 crc kubenswrapper[4792]: I0301 10:01:11.725278 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:12 crc kubenswrapper[4792]: I0301 10:01:12.059270 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerID="45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000" exitCode=0 Mar 01 10:01:12 crc kubenswrapper[4792]: I0301 10:01:12.060151 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerDied","Data":"45c567ce10495b75abefb1478976cbc609499a5f59a4a0f633c7587b44678000"} Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.321877 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.322441 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.462082 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:14 crc kubenswrapper[4792]: I0301 10:01:14.462119 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.132580 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29539321-sclgm" event={"ID":"7ec04609-b280-4df0-a0c5-2e4c7208c1c6","Type":"ContainerDied","Data":"9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae"} Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.132831 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb42b80e67cd6ee111c52690140db4093810e4ed24e856419a0421b142b5cae" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.217728 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411274 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411303 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " 
Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.411452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") pod \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\" (UID: \"7ec04609-b280-4df0-a0c5-2e4c7208c1c6\") " Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.434627 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.457106 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc" (OuterVolumeSpecName: "kube-api-access-72wdc") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "kube-api-access-72wdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.503065 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523367 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wdc\" (UniqueName: \"kubernetes.io/projected/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-kube-api-access-72wdc\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523400 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.523411 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.535277 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data" (OuterVolumeSpecName: "config-data") pod "7ec04609-b280-4df0-a0c5-2e4c7208c1c6" (UID: "7ec04609-b280-4df0-a0c5-2e4c7208c1c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:16 crc kubenswrapper[4792]: I0301 10:01:16.626326 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec04609-b280-4df0-a0c5-2e4c7208c1c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.085351 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.115449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.143899 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29539321-sclgm" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.145349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerStarted","Data":"a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5"} Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.163925 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-p2dtn" podStartSLOduration=6.72130499 podStartE2EDuration="16.163892306s" podCreationTimestamp="2026-03-01 10:01:01 +0000 UTC" firstStartedPulling="2026-03-01 10:01:06.589448941 +0000 UTC m=+3195.831328138" lastFinishedPulling="2026-03-01 10:01:16.032036257 +0000 UTC m=+3205.273915454" observedRunningTime="2026-03-01 10:01:17.161304382 +0000 UTC m=+3206.403183579" watchObservedRunningTime="2026-03-01 10:01:17.163892306 +0000 UTC m=+3206.405771503" Mar 01 10:01:17 crc kubenswrapper[4792]: I0301 10:01:17.184454 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 01 10:01:19 crc 
kubenswrapper[4792]: I0301 10:01:19.289787 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 01 10:01:24 crc kubenswrapper[4792]: I0301 10:01:24.323258 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:01:24 crc kubenswrapper[4792]: I0301 10:01:24.464049 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79f8cb6d9d-xg7h5" podUID="d7f79f77-ac1b-445e-8e28-85c8964f5461" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.13:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.13:8443: connect: connection refused" Mar 01 10:01:29 crc kubenswrapper[4792]: I0301 10:01:29.248980 4792 generic.go:334] "Generic (PLEG): container finished" podID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerID="a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5" exitCode=0 Mar 01 10:01:29 crc kubenswrapper[4792]: I0301 10:01:29.249547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerDied","Data":"a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5"} Mar 01 10:01:30 crc kubenswrapper[4792]: I0301 10:01:30.986524 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107783 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107907 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.107939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") pod \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\" (UID: \"01bf5dae-6217-4644-9c9b-65d3886a4dc1\") " Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.134376 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg" (OuterVolumeSpecName: "kube-api-access-n8pgg") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "kube-api-access-n8pgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.140232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.147481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data" (OuterVolumeSpecName: "config-data") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.151079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01bf5dae-6217-4644-9c9b-65d3886a4dc1" (UID: "01bf5dae-6217-4644-9c9b-65d3886a4dc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210380 4792 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210431 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8pgg\" (UniqueName: \"kubernetes.io/projected/01bf5dae-6217-4644-9c9b-65d3886a4dc1-kube-api-access-n8pgg\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210447 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.210455 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01bf5dae-6217-4644-9c9b-65d3886a4dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-p2dtn" event={"ID":"01bf5dae-6217-4644-9c9b-65d3886a4dc1","Type":"ContainerDied","Data":"602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632"} Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269524 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602a6322b970b1abc9872680d2570a6e260b8dc7e4033e670c6368378000d632" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.269593 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-p2dtn" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770133 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: E0301 10:01:31.770491 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770507 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: E0301 10:01:31.770534 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770539 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770722 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" containerName="manila-db-sync" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.770741 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec04609-b280-4df0-a0c5-2e4c7208c1c6" containerName="keystone-cron" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.779689 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790261 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790489 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-x2pjp" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790607 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.790715 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.823966 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.825572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.832180 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848610 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 
01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.848818 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc 
kubenswrapper[4792]: I0301 10:01:31.848863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.857099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.882195 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.941232 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.942708 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.952880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.952961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953042 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953072 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod 
\"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953157 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953272 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: 
\"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953297 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.953502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.954331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.966587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.978514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.979278 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:31 crc kubenswrapper[4792]: I0301 10:01:31.993451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.006986 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.007604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.009542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.057171 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.058894 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.061972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062040 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062058 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062098 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc 
kubenswrapper[4792]: I0301 10:01:32.062140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062211 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062290 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.062316 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.067622 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.068286 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.069189 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.074206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.090364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.090747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.091258 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.120001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.153547 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"manila-scheduler-0\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.156255 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165777 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165858 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc 
kubenswrapper[4792]: I0301 10:01:32.165934 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.165987 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166048 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.166096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.167722 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.168243 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.169070 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-config\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.185259 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.206180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/49541358-1fd0-4d1d-8b61-0c618994dfc0-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.222679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxsmh\" (UniqueName: \"kubernetes.io/projected/49541358-1fd0-4d1d-8b61-0c618994dfc0-kube-api-access-rxsmh\") pod \"dnsmasq-dns-86c6bdcc4c-fqgkv\" (UID: \"49541358-1fd0-4d1d-8b61-0c618994dfc0\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271363 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271421 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: 
\"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.271595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.274406 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.278003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.278410 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.284513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.291658 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.291880 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.292774 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.304688 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.321208 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"manila-api-0\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " pod="openstack/manila-api-0" Mar 01 10:01:32 crc kubenswrapper[4792]: I0301 10:01:32.609402 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.199165 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.292938 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-fqgkv"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.394150 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"bd049fc1e63654f23ff767894cd0f3d2ee5d142592fec5855ea0f40d653d992d"} Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.403710 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerStarted","Data":"908de245b9fc63d7b33b773f418fa3e1ffa0e2698ef86729aa65356a6eba8160"} Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.495880 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/manila-share-share1-0"] Mar 01 10:01:33 crc kubenswrapper[4792]: I0301 10:01:33.608585 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.322009 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.421888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"b3828e251539f5a3929096bc64abd403423b14917056f3386127f41c90279e5b"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.429692 4792 generic.go:334] "Generic (PLEG): container finished" podID="49541358-1fd0-4d1d-8b61-0c618994dfc0" containerID="bab38211efbbbcda82215e208b0cd4286972a7d51938b81d8c458f5923927746" exitCode=0 Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.429767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerDied","Data":"bab38211efbbbcda82215e208b0cd4286972a7d51938b81d8c458f5923927746"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.445087 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a"} Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.464098 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79f8cb6d9d-xg7h5" podUID="d7f79f77-ac1b-445e-8e28-85c8964f5461" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.1.13:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.13:8443: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.943675 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.944137 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.944179 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.945055 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:01:34 crc kubenswrapper[4792]: I0301 10:01:34.945119 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" gracePeriod=600 Mar 01 10:01:35 crc kubenswrapper[4792]: E0301 
10:01:35.169003 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.487654 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" exitCode=0 Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488089 4792 scope.go:117] "RemoveContainer" containerID="0c8bbdbe3553a3ed4617103bbd379245337bfad8f18870faba13af9b3c14caa1" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.488728 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:01:35 crc kubenswrapper[4792]: E0301 10:01:35.489030 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.503284 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.517838 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" event={"ID":"49541358-1fd0-4d1d-8b61-0c618994dfc0","Type":"ContainerStarted","Data":"260f9efb3b812a0d52fed5708cf6028ed15756fdebf443c912b3f22eea189dcb"} Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.518372 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.562714 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" podStartSLOduration=4.562696494 podStartE2EDuration="4.562696494s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:35.554807397 +0000 UTC m=+3224.796686594" watchObservedRunningTime="2026-03-01 10:01:35.562696494 +0000 UTC m=+3224.804575681" Mar 01 10:01:35 crc kubenswrapper[4792]: I0301 10:01:35.815261 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.566402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerStarted","Data":"6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b"} Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.566798 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log" 
containerID="cri-o://6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385" gracePeriod=30 Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.567204 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.567511 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api" containerID="cri-o://6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b" gracePeriod=30 Mar 01 10:01:36 crc kubenswrapper[4792]: I0301 10:01:36.628053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.584599 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.624466 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.624447081 podStartE2EDuration="6.624447081s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:36.597542457 +0000 UTC m=+3225.839421654" watchObservedRunningTime="2026-03-01 10:01:37.624447081 +0000 UTC m=+3226.866326278" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679551 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.679637 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: 
\"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680449 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") pod \"7026175e-efaf-497a-aaf1-079f2811ad08\" (UID: \"7026175e-efaf-497a-aaf1-079f2811ad08\") " Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.680814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerStarted","Data":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.688126 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs" (OuterVolumeSpecName: "logs") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.728859 4792 generic.go:334] "Generic (PLEG): container finished" podID="7026175e-efaf-497a-aaf1-079f2811ad08" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" exitCode=137 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.728898 4792 generic.go:334] "Generic (PLEG): container finished" podID="7026175e-efaf-497a-aaf1-079f2811ad08" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" exitCode=137 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5996ddfbb9-drpwn" event={"ID":"7026175e-efaf-497a-aaf1-079f2811ad08","Type":"ContainerDied","Data":"cdd7f60094a6b70916ce6fb3a1317938714cd27149f604e9df29f382e04185b2"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729104 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729700 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.385704516 podStartE2EDuration="6.729660973s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="2026-03-01 
10:01:33.217093603 +0000 UTC m=+3222.458972800" lastFinishedPulling="2026-03-01 10:01:34.56105006 +0000 UTC m=+3223.802929257" observedRunningTime="2026-03-01 10:01:37.716720371 +0000 UTC m=+3226.958599568" watchObservedRunningTime="2026-03-01 10:01:37.729660973 +0000 UTC m=+3226.971540170" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.729759 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5996ddfbb9-drpwn" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.733781 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2" (OuterVolumeSpecName: "kube-api-access-zcmd2") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "kube-api-access-zcmd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.737256 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765084 4792 generic.go:334] "Generic (PLEG): container finished" podID="992c179f-7feb-4441-b94a-81b52133f671" containerID="6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b" exitCode=0 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765389 4792 generic.go:334] "Generic (PLEG): container finished" podID="992c179f-7feb-4441-b94a-81b52133f671" containerID="6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385" exitCode=143 Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.765439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385"} Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.780691 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data" (OuterVolumeSpecName: "config-data") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.783702 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts" (OuterVolumeSpecName: "scripts") pod "7026175e-efaf-497a-aaf1-079f2811ad08" (UID: "7026175e-efaf-497a-aaf1-079f2811ad08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786393 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcmd2\" (UniqueName: \"kubernetes.io/projected/7026175e-efaf-497a-aaf1-079f2811ad08-kube-api-access-zcmd2\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786414 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786424 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7026175e-efaf-497a-aaf1-079f2811ad08-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786435 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7026175e-efaf-497a-aaf1-079f2811ad08-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:37 crc kubenswrapper[4792]: I0301 10:01:37.786468 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7026175e-efaf-497a-aaf1-079f2811ad08-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.069513 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.308747 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.341972 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.369403 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5996ddfbb9-drpwn"] Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.373529 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.391060 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.391108 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} err="failed to get container status \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.391135 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.398021 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398054 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} err="failed to get container status \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": rpc error: code = NotFound desc = could not find container \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398075 4792 scope.go:117] "RemoveContainer" containerID="54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398597 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac"} err="failed to get container status \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": rpc error: code = NotFound desc = could not find container \"54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac\": container with ID starting with 54af972835bf1ffc5525cfd1de0f83cd8659d1c005a950a21016d293f0b51bac not found: ID does not exist" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398630 4792 scope.go:117] "RemoveContainer" containerID="b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.398849 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa"} err="failed to get container status \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": rpc error: code = NotFound desc = could not find container \"b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa\": container with ID starting with b6c7806f191bce291b1fc9008b4e9e5c2b3a84dce2dc94634b54dd102f025baa not found: ID does not exist" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409678 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409838 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409881 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.409896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") pod \"992c179f-7feb-4441-b94a-81b52133f671\" (UID: \"992c179f-7feb-4441-b94a-81b52133f671\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.413122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs" (OuterVolumeSpecName: "logs") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.414681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.437050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.451416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r" (OuterVolumeSpecName: "kube-api-access-6xv8r") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "kube-api-access-6xv8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.473303 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts" (OuterVolumeSpecName: "scripts") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512088 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/992c179f-7feb-4441-b94a-81b52133f671-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512116 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992c179f-7feb-4441-b94a-81b52133f671-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512127 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512135 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.512146 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xv8r\" (UniqueName: \"kubernetes.io/projected/992c179f-7feb-4441-b94a-81b52133f671-kube-api-access-6xv8r\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.568974 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data" (OuterVolumeSpecName: "config-data") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.584612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "992c179f-7feb-4441-b94a-81b52133f671" (UID: "992c179f-7feb-4441-b94a-81b52133f671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.614109 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.614135 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992c179f-7feb-4441-b94a-81b52133f671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.758601 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793788 4792 generic.go:334] "Generic (PLEG): container finished" podID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" exitCode=137 Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793819 4792 generic.go:334] "Generic (PLEG): container finished" podID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" exitCode=137 Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5887d74897-rnlz9" event={"ID":"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7","Type":"ContainerDied","Data":"f6db483fc0d11feaf9f5fa34256dfeb85c12b4ea3626fb137a55f680d9cb8957"} Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.793902 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.794169 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5887d74897-rnlz9" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.800350 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.810625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"992c179f-7feb-4441-b94a-81b52133f671","Type":"ContainerDied","Data":"b3828e251539f5a3929096bc64abd403423b14917056f3386127f41c90279e5b"} Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820065 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820152 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820184 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.820207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pf4f\" (UniqueName: 
\"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") pod \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\" (UID: \"43f97fc5-fad3-4979-9a7d-2f24f9a8dac7\") " Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.822534 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs" (OuterVolumeSpecName: "logs") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.839170 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f" (OuterVolumeSpecName: "kube-api-access-2pf4f") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "kube-api-access-2pf4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.852894 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.858125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.865940 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.904604 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts" (OuterVolumeSpecName: "scripts") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910088 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910564 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910585 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910599 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910607 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910624 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910631 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910644 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910651 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910673 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910681 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log" Mar 01 10:01:38 crc kubenswrapper[4792]: E0301 10:01:38.910691 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.910699 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912024 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912052 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912075 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="992c179f-7feb-4441-b94a-81b52133f671" containerName="manila-api-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912090 4792 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.912108 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" containerName="horizon-log" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.913125 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918270 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918446 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.918549 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.924935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.924977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925009 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: 
\"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925031 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925092 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc 
kubenswrapper[4792]: I0301 10:01:38.925290 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925354 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925366 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925376 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.925387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pf4f\" (UniqueName: \"kubernetes.io/projected/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-kube-api-access-2pf4f\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:38 crc kubenswrapper[4792]: I0301 10:01:38.949452 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.005633 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data" (OuterVolumeSpecName: "config-data") pod "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" (UID: "43f97fc5-fad3-4979-9a7d-2f24f9a8dac7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.036347 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038300 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-etc-machine-id\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038587 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038676 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038789 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.038818 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.040782 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.041138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-logs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.043002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data-custom\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.049730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.051898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-scripts\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.052353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-public-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.052735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 
10:01:39.058863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-config-data\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.065371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttblj\" (UniqueName: \"kubernetes.io/projected/e6660fdc-5636-44ec-b6c0-e0e417d72e8a-kube-api-access-ttblj\") pod \"manila-api-0\" (UID: \"e6660fdc-5636-44ec-b6c0-e0e417d72e8a\") " pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.087201 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.160564 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" Mar 01 10:01:39 crc kubenswrapper[4792]: E0301 10:01:39.162263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID starting with 4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.162306 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} err="failed to get container status \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID starting with 
4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.162335 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" Mar 01 10:01:39 crc kubenswrapper[4792]: E0301 10:01:39.164347 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164374 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} err="failed to get container status \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164387 4792 scope.go:117] "RemoveContainer" containerID="4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164560 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095"} err="failed to get container status \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": rpc error: code = NotFound desc = could not find container \"4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095\": container with ID 
starting with 4236f2f0ade05ed729fae7f1ab3406c381b830b740b88994ee4a748285301095 not found: ID does not exist" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164578 4792 scope.go:117] "RemoveContainer" containerID="b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164749 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373"} err="failed to get container status \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": rpc error: code = NotFound desc = could not find container \"b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373\": container with ID starting with b9cdb74c5cb376a6549f73b6d93daac9b503601025b178e429bb6ba097174373 not found: ID does not exist" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.164769 4792 scope.go:117] "RemoveContainer" containerID="6344b5116272aa3291056d2fadaea6c73aaf3b2724251ae1e478364c6553097b" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.176396 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.186051 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5887d74897-rnlz9"] Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.207052 4792 scope.go:117] "RemoveContainer" containerID="6771ca6e038c525ca49561d70b9c91f98294b59eb490890e83179c43e3ef6385" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.249669 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.438675 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f97fc5-fad3-4979-9a7d-2f24f9a8dac7" path="/var/lib/kubelet/pods/43f97fc5-fad3-4979-9a7d-2f24f9a8dac7/volumes" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.450165 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7026175e-efaf-497a-aaf1-079f2811ad08" path="/var/lib/kubelet/pods/7026175e-efaf-497a-aaf1-079f2811ad08/volumes" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.450780 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992c179f-7feb-4441-b94a-81b52133f671" path="/var/lib/kubelet/pods/992c179f-7feb-4441-b94a-81b52133f671/volumes" Mar 01 10:01:39 crc kubenswrapper[4792]: I0301 10:01:39.906264 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 01 10:01:40 crc kubenswrapper[4792]: I0301 10:01:40.837052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"2391d1418b109063f4ee67e28ee9b1de55f0fc32895676092aed9ef9a77d1746"} Mar 01 10:01:40 crc kubenswrapper[4792]: I0301 10:01:40.837565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"a8f7036e4a049e5741f848ea5e0b58d2f8611e1e0df796b9d649218a1f4571c5"} Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.851679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"e6660fdc-5636-44ec-b6c0-e0e417d72e8a","Type":"ContainerStarted","Data":"d2d356db92cc8311cc9321284f1aa2bc6fa39f9e2291eb803eace6b30a6f0334"} Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.852102 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/manila-api-0" Mar 01 10:01:41 crc kubenswrapper[4792]: I0301 10:01:41.903324 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.9033084369999997 podStartE2EDuration="3.903308437s" podCreationTimestamp="2026-03-01 10:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:01:41.897559473 +0000 UTC m=+3231.139438670" watchObservedRunningTime="2026-03-01 10:01:41.903308437 +0000 UTC m=+3231.145187644" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.159060 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.295079 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86c6bdcc4c-fqgkv" Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.405779 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.406064 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb7494899-9x44w" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns" containerID="cri-o://8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" gracePeriod=10 Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.863304 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3a408d8-0510-4867-8517-e609d614a5d2" containerID="8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" exitCode=0 Mar 01 10:01:42 crc kubenswrapper[4792]: I0301 10:01:42.864547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" 
event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0"} Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.078108 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.146790 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") pod \"a3a408d8-0510-4867-8517-e609d614a5d2\" (UID: \"a3a408d8-0510-4867-8517-e609d614a5d2\") " Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.175219 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9" (OuterVolumeSpecName: "kube-api-access-cjnt9") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "kube-api-access-cjnt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.230658 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.260384 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjnt9\" (UniqueName: \"kubernetes.io/projected/a3a408d8-0510-4867-8517-e609d614a5d2-kube-api-access-cjnt9\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.260422 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.263772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.265682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config" (OuterVolumeSpecName: "config") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.265887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.273574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3a408d8-0510-4867-8517-e609d614a5d2" (UID: "a3a408d8-0510-4867-8517-e609d614a5d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363758 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363787 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363798 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.363807 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3a408d8-0510-4867-8517-e609d614a5d2-config\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-9x44w" event={"ID":"a3a408d8-0510-4867-8517-e609d614a5d2","Type":"ContainerDied","Data":"3830dbf250a4bedcc384f157cb50c2184899780d10921fec2974544052904874"} Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877784 4792 scope.go:117] "RemoveContainer" 
containerID="8b04ed1bd42aa863eef16fe08f4819a294c0cd44d80b6329225717f3d7d610c0" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.877799 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-9x44w" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.916377 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.935001 4792 scope.go:117] "RemoveContainer" containerID="9c10172b37a7ca756da9dc968292ad3961c9a1084ca116b725c3c50da7e6ecd8" Mar 01 10:01:43 crc kubenswrapper[4792]: I0301 10:01:43.935173 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-9x44w"] Mar 01 10:01:45 crc kubenswrapper[4792]: I0301 10:01:45.427873 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" path="/var/lib/kubelet/pods/a3a408d8-0510-4867-8517-e609d614a5d2/volumes" Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.334938 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335481 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent" containerID="cri-o://3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335783 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd" containerID="cri-o://6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335861 4792 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent" containerID="cri-o://61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.335879 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core" containerID="cri-o://9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" gracePeriod=30 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920138 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" exitCode=0 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920463 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" exitCode=2 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920474 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" exitCode=0 Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920506 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.920517 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"} Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.960767 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:47 crc kubenswrapper[4792]: I0301 10:01:47.969344 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.412325 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:01:48 crc kubenswrapper[4792]: E0301 10:01:48.413353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.832387 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889772 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889967 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.889985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890106 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890126 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") pod \"a4871267-e63c-4804-a404-869a0fdbd171\" (UID: \"a4871267-e63c-4804-a404-869a0fdbd171\") " Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.890501 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.893200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.895943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg" (OuterVolumeSpecName: "kube-api-access-cl6tg") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "kube-api-access-cl6tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.911525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts" (OuterVolumeSpecName: "scripts") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.966128 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998146 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998169 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998180 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4871267-e63c-4804-a404-869a0fdbd171-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998188 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:48 crc kubenswrapper[4792]: I0301 10:01:48.998198 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl6tg\" (UniqueName: \"kubernetes.io/projected/a4871267-e63c-4804-a404-869a0fdbd171-kube-api-access-cl6tg\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002640 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4871267-e63c-4804-a404-869a0fdbd171" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" exitCode=0 Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"} Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002708 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4871267-e63c-4804-a404-869a0fdbd171","Type":"ContainerDied","Data":"5daa2a8c28823d6b2f08c8830868b8a9480988afbc9d6f7999eee6cf3c7e7ff9"} Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002724 4792 scope.go:117] "RemoveContainer" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.002874 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.033024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.089421 4792 scope.go:117] "RemoveContainer" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.096098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.100486 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.100518 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.122994 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data" (OuterVolumeSpecName: "config-data") pod "a4871267-e63c-4804-a404-869a0fdbd171" (UID: "a4871267-e63c-4804-a404-869a0fdbd171"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.147046 4792 scope.go:117] "RemoveContainer" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.202179 4792 scope.go:117] "RemoveContainer" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.202315 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4871267-e63c-4804-a404-869a0fdbd171-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.261071 4792 scope.go:117] "RemoveContainer" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.268039 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": container with ID starting with 6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d not found: ID does not exist" containerID="6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.268079 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d"} err="failed to get container status \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": rpc error: code = NotFound desc = could not find container \"6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d\": container with ID starting with 6a717ecdd9676dfa29b9bb443090712efe20cbd520284d6f5c7116f4cf10875d not found: ID does not exist" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.268105 4792 scope.go:117] "RemoveContainer" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.274017 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": container with ID starting with 9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20 not found: ID does not exist" containerID="9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.274052 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20"} err="failed to get container status \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": rpc error: code = NotFound desc = could not find container \"9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20\": 
container with ID starting with 9953bdd8b70e316fcb7e17da4271cb1872c378bfc503f331c38bf892ec73dc20 not found: ID does not exist" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.274072 4792 scope.go:117] "RemoveContainer" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.276188 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": container with ID starting with 61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57 not found: ID does not exist" containerID="61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.276224 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57"} err="failed to get container status \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": rpc error: code = NotFound desc = could not find container \"61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57\": container with ID starting with 61259115634a264180b7ef973ccc70cba261d421d03c9f32f31608df7c9afe57 not found: ID does not exist" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.276244 4792 scope.go:117] "RemoveContainer" containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.281000 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": container with ID starting with 3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14 not found: ID does not exist" 
containerID="3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.281027 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14"} err="failed to get container status \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": rpc error: code = NotFound desc = could not find container \"3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14\": container with ID starting with 3d636a3af1c36dd5782bf30fc9b5448d960eda4cadc3be15e2341f14ec6c7b14 not found: ID does not exist" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.362363 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.375691 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.439763 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4871267-e63c-4804-a404-869a0fdbd171" path="/var/lib/kubelet/pods/a4871267-e63c-4804-a404-869a0fdbd171/volumes" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.440561 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441067 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441079 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441093 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent" 
Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441099 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441119 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441130 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441136 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441146 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="init" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441152 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="init" Mar 01 10:01:49 crc kubenswrapper[4792]: E0301 10:01:49.441167 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441172 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441348 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-central-agent" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 
10:01:49.441369 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a408d8-0510-4867-8517-e609d614a5d2" containerName="dnsmasq-dns" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441377 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="sg-core" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441388 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="ceilometer-notification-agent" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.441398 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4871267-e63c-4804-a404-869a0fdbd171" containerName="proxy-httpd" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.443191 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.443280 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.446973 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.447175 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.451258 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521742 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521888 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: 
\"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.521996 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522018 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.522173 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623530 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623668 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.623726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.624506 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.626433 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.628873 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.632855 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.636978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.637631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.638019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.651370 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"ceilometer-0\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " pod="openstack/ceilometer-0" Mar 01 10:01:49 crc kubenswrapper[4792]: I0301 10:01:49.770417 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.047825 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe"} Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.414289 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:01:50 crc kubenswrapper[4792]: I0301 10:01:50.713744 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.058965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerStarted","Data":"e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe"} Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.060749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34"} Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.091246 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.19442946 podStartE2EDuration="20.091226474s" podCreationTimestamp="2026-03-01 10:01:31 +0000 UTC" firstStartedPulling="2026-03-01 10:01:33.546805901 +0000 UTC m=+3222.788685098" lastFinishedPulling="2026-03-01 10:01:48.443602915 +0000 UTC m=+3237.685482112" observedRunningTime="2026-03-01 10:01:51.088712211 +0000 UTC m=+3240.330591428" watchObservedRunningTime="2026-03-01 10:01:51.091226474 +0000 UTC m=+3240.333105671" Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.372610 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79f8cb6d9d-xg7h5" Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520405 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520701 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log" containerID="cri-o://7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" gracePeriod=30 Mar 01 10:01:51 crc kubenswrapper[4792]: I0301 10:01:51.520978 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" containerID="cri-o://809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" gracePeriod=30 Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.070245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"} Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.070586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"} Mar 01 10:01:52 crc kubenswrapper[4792]: I0301 10:01:52.120449 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 01 10:01:53 crc kubenswrapper[4792]: I0301 10:01:53.081164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"} Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.496274 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.578993 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:54 crc kubenswrapper[4792]: I0301 10:01:54.709605 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:57444->10.217.1.12:8443: read: connection reset by peer" Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.098969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerStarted","Data":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"} Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.099570 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101256 4792 generic.go:334] "Generic (PLEG): container finished" podID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" exitCode=0 Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"} Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101453 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler" containerID="cri-o://dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" gracePeriod=30 Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.101479 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe" containerID="cri-o://06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" gracePeriod=30 Mar 01 10:01:55 crc kubenswrapper[4792]: I0301 10:01:55.126047 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037003952 podStartE2EDuration="6.126031466s" podCreationTimestamp="2026-03-01 10:01:49 +0000 UTC" firstStartedPulling="2026-03-01 10:01:50.437161142 +0000 UTC m=+3239.679040339" lastFinishedPulling="2026-03-01 10:01:54.526188666 +0000 UTC m=+3243.768067853" observedRunningTime="2026-03-01 10:01:55.12337733 +0000 UTC m=+3244.365256527" watchObservedRunningTime="2026-03-01 10:01:55.126031466 +0000 UTC m=+3244.367910663" Mar 01 10:01:56 crc kubenswrapper[4792]: I0301 10:01:56.112879 4792 generic.go:334] "Generic (PLEG): container finished" podID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" exitCode=0 Mar 01 10:01:56 crc kubenswrapper[4792]: I0301 10:01:56.112952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"} Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.124622 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134255 4792 generic.go:334] "Generic (PLEG): container finished" podID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" exitCode=0 Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134298 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134299 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"} Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38","Type":"ContainerDied","Data":"bd049fc1e63654f23ff767894cd0f3d2ee5d142592fec5855ea0f40d653d992d"} Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.134350 4792 scope.go:117] "RemoveContainer" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.197120 4792 scope.go:117] "RemoveContainer" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.227769 4792 scope.go:117] "RemoveContainer" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.228660 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": container with ID starting with 06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98 not 
found: ID does not exist" containerID="06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.228736 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98"} err="failed to get container status \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": rpc error: code = NotFound desc = could not find container \"06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98\": container with ID starting with 06366ec6e532eeea2e6a301ab0ca2d1e855bb93e6c5118bc9e3b934abde5eb98 not found: ID does not exist" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.228770 4792 scope.go:117] "RemoveContainer" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.232213 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": container with ID starting with dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a not found: ID does not exist" containerID="dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.232260 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a"} err="failed to get container status \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": rpc error: code = NotFound desc = could not find container \"dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a\": container with ID starting with dde55c4e7a9e0e028ff7df99a2c399eb3840b164a780338d688539c0a3c4421a not found: ID does not exist" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242538 
4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242781 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.242870 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243117 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") pod \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.243420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") pod 
\"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\" (UID: \"624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38\") " Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.244419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.251859 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts" (OuterVolumeSpecName: "scripts") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.253184 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.268271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x" (OuterVolumeSpecName: "kube-api-access-qbl2x") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "kube-api-access-qbl2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.308259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346829 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346867 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346881 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346890 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbl2x\" (UniqueName: \"kubernetes.io/projected/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-kube-api-access-qbl2x\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.346901 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.365964 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data" (OuterVolumeSpecName: "config-data") pod "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" (UID: "624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.449250 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.466299 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.474917 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490367 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.490715 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490730 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler" Mar 01 10:01:58 crc kubenswrapper[4792]: E0301 10:01:58.490759 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490765 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490953 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="manila-scheduler" Mar 01 
10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.490974 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" containerName="probe" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.493299 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.505407 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.508052 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552353 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552397 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.552736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod 
\"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655477 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655522 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.655547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.656870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5813cf9a-1d9e-4a74-82e1-68e994c9175a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.668778 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 
10:01:58.669202 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.670648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-scripts\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.672458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5813cf9a-1d9e-4a74-82e1-68e994c9175a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.684968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgsgj\" (UniqueName: \"kubernetes.io/projected/5813cf9a-1d9e-4a74-82e1-68e994c9175a-kube-api-access-rgsgj\") pod \"manila-scheduler-0\" (UID: \"5813cf9a-1d9e-4a74-82e1-68e994c9175a\") " pod="openstack/manila-scheduler-0" Mar 01 10:01:58 crc kubenswrapper[4792]: I0301 10:01:58.855460 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 01 10:01:59 crc kubenswrapper[4792]: I0301 10:01:59.355272 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 01 10:01:59 crc kubenswrapper[4792]: I0301 10:01:59.434171 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38" path="/var/lib/kubelet/pods/624fc6ea-e4bd-4f79-b2b4-5c50c94bcd38/volumes" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.183689 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.185214 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190300 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190453 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.190566 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.221152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"d9817d4083c6146200ac1a8ff5a62c565ddee025f2fa812a7e0633271e866126"} Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.221380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"569ed6ced776f8e2e2dea6fb2be8cc5e458809032eb5e68566854ce47fc18ffe"} Mar 01 10:02:00 crc 
kubenswrapper[4792]: I0301 10:02:00.224964 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.244312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.349185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.372759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"auto-csr-approver-29539322-cwkrx\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:00 crc kubenswrapper[4792]: I0301 10:02:00.522876 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.123634 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.230375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerStarted","Data":"5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c"} Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.241374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"5813cf9a-1d9e-4a74-82e1-68e994c9175a","Type":"ContainerStarted","Data":"d2093085d488f06da18044dc0dee3114661762c3b456b0a7fcc6c58eabb671c3"} Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.262309 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.2622684140000002 podStartE2EDuration="3.262268414s" podCreationTimestamp="2026-03-01 10:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:02:01.259057014 +0000 UTC m=+3250.500936231" watchObservedRunningTime="2026-03-01 10:02:01.262268414 +0000 UTC m=+3250.504147621" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.418370 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:01 crc kubenswrapper[4792]: E0301 10:02:01.418970 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:01 crc kubenswrapper[4792]: I0301 10:02:01.604584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 01 10:02:02 crc kubenswrapper[4792]: I0301 10:02:02.250259 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerStarted","Data":"ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e"} Mar 01 10:02:02 crc kubenswrapper[4792]: I0301 10:02:02.270263 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" podStartSLOduration=1.480602566 podStartE2EDuration="2.270241657s" podCreationTimestamp="2026-03-01 10:02:00 +0000 UTC" firstStartedPulling="2026-03-01 10:02:01.146617982 +0000 UTC m=+3250.388497179" lastFinishedPulling="2026-03-01 10:02:01.936257073 +0000 UTC m=+3251.178136270" observedRunningTime="2026-03-01 10:02:02.262970806 +0000 UTC m=+3251.504850013" watchObservedRunningTime="2026-03-01 10:02:02.270241657 +0000 UTC m=+3251.512120854" Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.611420 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612086 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" containerID="cri-o://60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612656 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" containerID="cri-o://db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612711 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" containerID="cri-o://1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.612750 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" containerID="cri-o://397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" gracePeriod=30 Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.893355 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 01 10:02:03 crc kubenswrapper[4792]: I0301 10:02:03.948555 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.267680 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerID="ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.267731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerDied","Data":"ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271370 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" 
containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271397 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" exitCode=2 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271410 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" exitCode=0 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271935 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.271948 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"} Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.272225 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" containerID="cri-o://f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe" gracePeriod=30 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.272394 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/manila-share-share1-0" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" containerID="cri-o://e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe" gracePeriod=30 Mar 01 10:02:04 crc kubenswrapper[4792]: I0301 10:02:04.321927 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.221380 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256217 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256568 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256586 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.256702 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") pod \"d145fe82-e716-418e-990b-c139edc82fa5\" (UID: \"d145fe82-e716-418e-990b-c139edc82fa5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.261697 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.262014 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.268930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts" (OuterVolumeSpecName: "scripts") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.278405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx" (OuterVolumeSpecName: "kube-api-access-wvckx") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "kube-api-access-wvckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285064 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285008 4792 generic.go:334] "Generic (PLEG): container finished" podID="d145fe82-e716-418e-990b-c139edc82fa5" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" exitCode=0 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d145fe82-e716-418e-990b-c139edc82fa5","Type":"ContainerDied","Data":"ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.285871 4792 scope.go:117] "RemoveContainer" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.288937 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerID="e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe" exitCode=0 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.288968 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerID="f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe" exitCode=1 Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289229 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289245 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2e20671b-de40-40a8-8237-2bd9940b9af5","Type":"ContainerDied","Data":"4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a"} Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.289259 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.339895 4792 scope.go:117] "RemoveContainer" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.341798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.349390 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358128 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358325 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358403 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358533 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") pod \"2e20671b-de40-40a8-8237-2bd9940b9af5\" (UID: \"2e20671b-de40-40a8-8237-2bd9940b9af5\") " Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358956 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.358979 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d145fe82-e716-418e-990b-c139edc82fa5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359022 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvckx\" (UniqueName: \"kubernetes.io/projected/d145fe82-e716-418e-990b-c139edc82fa5-kube-api-access-wvckx\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359037 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 
10:02:05.359048 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.359125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.362064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.369866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts" (OuterVolumeSpecName: "scripts") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.372686 4792 scope.go:117] "RemoveContainer" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.374878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph" (OuterVolumeSpecName: "ceph") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.379614 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d" (OuterVolumeSpecName: "kube-api-access-r5f4d") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "kube-api-access-r5f4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.384168 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.404861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.404964 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.457145 4792 scope.go:117] "RemoveContainer" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461268 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461290 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461302 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5f4d\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-kube-api-access-r5f4d\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461312 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461320 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461329 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e20671b-de40-40a8-8237-2bd9940b9af5-ceph\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461338 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.461346 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e20671b-de40-40a8-8237-2bd9940b9af5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.492978 4792 scope.go:117] "RemoveContainer" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.497465 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": container with ID starting with db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 not found: ID does not exist" containerID="db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.497518 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6"} err="failed to get container status \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": rpc error: code = NotFound desc = could not find container \"db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6\": 
container with ID starting with db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.497545 4792 scope.go:117] "RemoveContainer" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.498044 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": container with ID starting with 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 not found: ID does not exist" containerID="1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498081 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34"} err="failed to get container status \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": rpc error: code = NotFound desc = could not find container \"1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34\": container with ID starting with 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498108 4792 scope.go:117] "RemoveContainer" containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.498881 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": container with ID starting with 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 not found: ID does not exist" 
containerID="397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498935 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8"} err="failed to get container status \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": rpc error: code = NotFound desc = could not find container \"397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8\": container with ID starting with 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.498965 4792 scope.go:117] "RemoveContainer" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.499229 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": container with ID starting with 60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189 not found: ID does not exist" containerID="60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.499257 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189"} err="failed to get container status \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": rpc error: code = NotFound desc = could not find container \"60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189\": container with ID starting with 60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189 not found: ID does not exist" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.501792 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data" (OuterVolumeSpecName: "config-data") pod "d145fe82-e716-418e-990b-c139edc82fa5" (UID: "d145fe82-e716-418e-990b-c139edc82fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.503165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.562919 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.562949 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d145fe82-e716-418e-990b-c139edc82fa5-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.630042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data" (OuterVolumeSpecName: "config-data") pod "2e20671b-de40-40a8-8237-2bd9940b9af5" (UID: "2e20671b-de40-40a8-8237-2bd9940b9af5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.666614 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e20671b-de40-40a8-8237-2bd9940b9af5-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.708151 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.718439 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746079 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746516 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746530 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746545 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746551 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746568 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746574 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 
10:02:05.746599 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746604 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746612 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: E0301 10:02:05.746629 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746634 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746788 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-notification-agent" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746800 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="manila-share" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746815 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="proxy-httpd" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746823 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="ceilometer-central-agent" Mar 01 10:02:05 crc 
kubenswrapper[4792]: I0301 10:02:05.746832 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d145fe82-e716-418e-990b-c139edc82fa5" containerName="sg-core" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.746841 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" containerName="probe" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.748359 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757424 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.757432 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770776 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.770803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.794536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " 
pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872832 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.872929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.873162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-log-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.873401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63238274-bc2e-4686-8371-e891944269f9-run-httpd\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.880801 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.889514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.890683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.890892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-config-data\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.893854 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63238274-bc2e-4686-8371-e891944269f9-scripts\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.901399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4g9h\" (UniqueName: \"kubernetes.io/projected/63238274-bc2e-4686-8371-e891944269f9-kube-api-access-k4g9h\") pod \"ceilometer-0\" (UID: \"63238274-bc2e-4686-8371-e891944269f9\") " pod="openstack/ceilometer-0" Mar 01 10:02:05 crc kubenswrapper[4792]: I0301 10:02:05.993024 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.085786 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") pod \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\" (UID: \"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b\") " Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.089830 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6" (OuterVolumeSpecName: "kube-api-access-7xjg6") pod "d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" (UID: "d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b"). InnerVolumeSpecName "kube-api-access-7xjg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.115342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.188275 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjg6\" (UniqueName: \"kubernetes.io/projected/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b-kube-api-access-7xjg6\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.324289 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.325857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539322-cwkrx" event={"ID":"d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b","Type":"ContainerDied","Data":"5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c"} Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.325900 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7d9c9a77b56bc28c264fe19e619434f70526367aaa6c6a9caab4e4dd2ae71c" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.350406 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.415424 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.455508 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.480560 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539316-8swf8"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.514133 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.527873 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: E0301 10:02:06.528298 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.528310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 
10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.528470 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" containerName="oc" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.529387 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.531849 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.555762 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601858 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.601896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602086 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.602326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.628414 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 01 10:02:06 crc 
kubenswrapper[4792]: I0301 10:02:06.704222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704323 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704517 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.704660 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.706344 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.706403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03462f2f-874f-496a-934b-9fa6e2c55850-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.711239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-combined-ca-bundle\") pod \"manila-share-share1-0\" 
(UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.711285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-ceph\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.715866 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.716413 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.717852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03462f2f-874f-496a-934b-9fa6e2c55850-scripts\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.725512 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7l6\" (UniqueName: \"kubernetes.io/projected/03462f2f-874f-496a-934b-9fa6e2c55850-kube-api-access-rd7l6\") pod \"manila-share-share1-0\" (UID: \"03462f2f-874f-496a-934b-9fa6e2c55850\") " pod="openstack/manila-share-share1-0" Mar 01 10:02:06 crc kubenswrapper[4792]: I0301 10:02:06.868608 4792 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.323611 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.362657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"cb5cb3199883bd66de434a4d6ee46e1ac35df0ce429d377c147e61cfcded4c58"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.363447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"8e22199d500f0805c8cb6570bc524a00a0d8e763a509e63429d8035d5688ce87"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.364670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"d3e1d07f4f3d51eda7e71efc89eb8ff096e9663bfa6ae3162512081082bb1fdd"} Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.419523 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e20671b-de40-40a8-8237-2bd9940b9af5" path="/var/lib/kubelet/pods/2e20671b-de40-40a8-8237-2bd9940b9af5/volumes" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.420397 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d145fe82-e716-418e-990b-c139edc82fa5" path="/var/lib/kubelet/pods/d145fe82-e716-418e-990b-c139edc82fa5/volumes" Mar 01 10:02:07 crc kubenswrapper[4792]: I0301 10:02:07.421065 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9273c82-13c3-43c5-b90e-16fdb09f082e" path="/var/lib/kubelet/pods/e9273c82-13c3-43c5-b90e-16fdb09f082e/volumes" Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.375747 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"d6c30ffe6f1157993f4a62a4542ac5064a07854da7cd41adb3804ed8d39849e0"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.378891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"8795b79f2c94dd66df278ff8afdb3d312a999dfa7aaefa0d77fb6d40b12819e4"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.378956 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03462f2f-874f-496a-934b-9fa6e2c55850","Type":"ContainerStarted","Data":"6a042cba6eb3d9aaa9069cad10940f945ae7ed290d26f8640b67514c0c058ab7"} Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.411717 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.411699345 podStartE2EDuration="2.411699345s" podCreationTimestamp="2026-03-01 10:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:02:08.398854175 +0000 UTC m=+3257.640733372" watchObservedRunningTime="2026-03-01 10:02:08.411699345 +0000 UTC m=+3257.653578542" Mar 01 10:02:08 crc kubenswrapper[4792]: I0301 10:02:08.857088 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 01 10:02:09 crc kubenswrapper[4792]: I0301 10:02:09.391753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"8d216a9216980eb8491eb4fee4c11bec75c712da7d6e97f8a3cdee3fb22149e7"} Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.406751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"63238274-bc2e-4686-8371-e891944269f9","Type":"ContainerStarted","Data":"789df2a55bd98a76a43c542f546f7179c9a04fc9a4b5dec0e2e4ed322abb8b0f"} Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.407952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 01 10:02:10 crc kubenswrapper[4792]: I0301 10:02:10.437794 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.285034405 podStartE2EDuration="5.437777683s" podCreationTimestamp="2026-03-01 10:02:05 +0000 UTC" firstStartedPulling="2026-03-01 10:02:06.650140301 +0000 UTC m=+3255.892019498" lastFinishedPulling="2026-03-01 10:02:09.802883579 +0000 UTC m=+3259.044762776" observedRunningTime="2026-03-01 10:02:10.431716192 +0000 UTC m=+3259.673595399" watchObservedRunningTime="2026-03-01 10:02:10.437777683 +0000 UTC m=+3259.679656890" Mar 01 10:02:12 crc kubenswrapper[4792]: I0301 10:02:12.409412 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:12 crc kubenswrapper[4792]: E0301 10:02:12.410145 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:14 crc kubenswrapper[4792]: I0301 10:02:14.321800 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-689c76c966-7mbkl" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.12:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.12:8443: connect: connection refused" Mar 01 
10:02:14 crc kubenswrapper[4792]: I0301 10:02:14.321926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:16 crc kubenswrapper[4792]: I0301 10:02:16.870274 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 01 10:02:20 crc kubenswrapper[4792]: I0301 10:02:20.552611 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.546237 4792 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1097b0e_1156_4f3c_b1e9_6f7b83d0e07b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1097b0e_1156_4f3c_b1e9_6f7b83d0e07b.slice: no such file or directory Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.556371 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8.scope WatchSource:0}: Error finding container 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8: Status 404 returned error can't find the container with id 397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8 Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.559824 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34.scope WatchSource:0}: Error finding container 1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34: Status 404 returned error can't find the container with id 
1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34 Mar 01 10:02:21 crc kubenswrapper[4792]: W0301 10:02:21.560281 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6.scope WatchSource:0}: Error finding container db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6: Status 404 returned error can't find the container with id db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6 Mar 01 10:02:21 crc kubenswrapper[4792]: E0301 10:02:21.828521 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-conmon-f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-conmon-e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-ce25965b24241e019e5342dcd3aa81c21c3982896b8fa1aed06c530d8002ff34\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-1ae470c9722b44df4620ea0f02bb04d1aaff8e37f4b00ed39ee445412fbd0e34.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e20671b_de40_40a8_8237_2bd9940b9af5.slice/crio-4f751b65cd13e8f9d7bbd49d2e059dd52782b4f71c9b59600e77ee78c852e20a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-db37dac072d6d2caae6fc34c918a3cef14b10a312e58745dfb92fa4708db31e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-60637fe04075519a9c42d81a400a742249a0bd0cb88a9744d6cb782e2e26c189.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e060ef_1cc6_4b39_8622_bbcc183bdda0.slice/crio-conmon-7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd145fe82_e716_418e_990b_c139edc82fa5.slice/crio-conmon-397b59d785e3917d16a895a2d5a3a1112dd5a93072735c2c87d76a8c570eace8.scope\": RecentStats: unable to find data in memory cache]" Mar 01 10:02:21 crc kubenswrapper[4792]: I0301 10:02:21.940973 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011479 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011857 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011917 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.011951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv6tz\" (UniqueName: 
\"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.012168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") pod \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\" (UID: \"67e060ef-1cc6-4b39-8622-bbcc183bdda0\") " Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.013782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs" (OuterVolumeSpecName: "logs") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.031185 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz" (OuterVolumeSpecName: "kube-api-access-rv6tz") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). 
InnerVolumeSpecName "kube-api-access-rv6tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.031629 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.034444 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data" (OuterVolumeSpecName: "config-data") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.040092 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts" (OuterVolumeSpecName: "scripts") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.047263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.081632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "67e060ef-1cc6-4b39-8622-bbcc183bdda0" (UID: "67e060ef-1cc6-4b39-8622-bbcc183bdda0"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114229 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-config-data\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114267 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114278 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114286 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/67e060ef-1cc6-4b39-8622-bbcc183bdda0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114296 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv6tz\" (UniqueName: \"kubernetes.io/projected/67e060ef-1cc6-4b39-8622-bbcc183bdda0-kube-api-access-rv6tz\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114307 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/67e060ef-1cc6-4b39-8622-bbcc183bdda0-logs\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.114314 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e060ef-1cc6-4b39-8622-bbcc183bdda0-scripts\") on node \"crc\" DevicePath \"\"" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.547959 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-689c76c966-7mbkl" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.548341 4792 generic.go:334] "Generic (PLEG): container finished" podID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" exitCode=137 Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.548223 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.551595 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-689c76c966-7mbkl" event={"ID":"67e060ef-1cc6-4b39-8622-bbcc183bdda0","Type":"ContainerDied","Data":"ca973dc9b124eaa6124a0d372262dadb60856c6d51900f904c6c257a734ebfb0"} Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.551625 4792 scope.go:117] "RemoveContainer" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.611226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.627768 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-689c76c966-7mbkl"] Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.748432 4792 
scope.go:117] "RemoveContainer" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.781586 4792 scope.go:117] "RemoveContainer" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: E0301 10:02:22.782729 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": container with ID starting with 809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b not found: ID does not exist" containerID="809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.782774 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b"} err="failed to get container status \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": rpc error: code = NotFound desc = could not find container \"809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b\": container with ID starting with 809c1ff3a35d23863d9e774c5c257d32ce27f54609e7871a540c404967d6246b not found: ID does not exist" Mar 01 10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.782803 4792 scope.go:117] "RemoveContainer" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 10:02:22 crc kubenswrapper[4792]: E0301 10:02:22.783430 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": container with ID starting with 7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce not found: ID does not exist" containerID="7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce" Mar 01 
10:02:22 crc kubenswrapper[4792]: I0301 10:02:22.783462 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce"} err="failed to get container status \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": rpc error: code = NotFound desc = could not find container \"7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce\": container with ID starting with 7eefa3b27a0172eca37fa354a49dffdfcea57a296cf27db9b99db7f3391b92ce not found: ID does not exist" Mar 01 10:02:23 crc kubenswrapper[4792]: I0301 10:02:23.424192 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" path="/var/lib/kubelet/pods/67e060ef-1cc6-4b39-8622-bbcc183bdda0/volumes" Mar 01 10:02:24 crc kubenswrapper[4792]: I0301 10:02:24.408732 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:24 crc kubenswrapper[4792]: E0301 10:02:24.409259 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:28 crc kubenswrapper[4792]: I0301 10:02:28.411300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 01 10:02:32 crc kubenswrapper[4792]: I0301 10:02:32.164213 4792 scope.go:117] "RemoveContainer" containerID="3aeb9f44a1b454186ac24af4b6119b2ea036267663153223c161c28c89a3a926" Mar 01 10:02:36 crc kubenswrapper[4792]: I0301 10:02:36.133114 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Mar 01 10:02:38 crc kubenswrapper[4792]: I0301 10:02:38.408926 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:38 crc kubenswrapper[4792]: E0301 10:02:38.410151 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:02:53 crc kubenswrapper[4792]: I0301 10:02:53.411091 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:02:53 crc kubenswrapper[4792]: E0301 10:02:53.411921 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:03:06 crc kubenswrapper[4792]: I0301 10:03:06.409183 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:03:06 crc kubenswrapper[4792]: E0301 10:03:06.410069 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:03:21 crc kubenswrapper[4792]: I0301 10:03:21.416371 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:03:21 crc kubenswrapper[4792]: E0301 10:03:21.417348 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:03:34 crc kubenswrapper[4792]: I0301 10:03:34.409793 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:03:34 crc kubenswrapper[4792]: E0301 10:03:34.410526 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.014732 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 01 10:03:38 crc kubenswrapper[4792]: E0301 10:03:38.015774 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.015793 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log" Mar 01 10:03:38 crc kubenswrapper[4792]: E0301 10:03:38.015836 
4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.015848 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016105 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon-log" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016129 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e060ef-1cc6-4b39-8622-bbcc183bdda0" containerName="horizon" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.016962 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.018763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.018993 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.019260 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nl48r" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.021397 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.041305 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.107748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108451 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.108838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109075 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.109362 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.210947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211403 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.211637 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.212271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.212985 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " 
pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.213453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.216949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.221806 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.221987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.233077 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.243597 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") " pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.341408 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 01 10:03:38 crc kubenswrapper[4792]: I0301 10:03:38.799738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 01 10:03:38 crc kubenswrapper[4792]: W0301 10:03:38.804981 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1c75ce_61f7_4ce5_a757_b7405d7135bd.slice/crio-56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432 WatchSource:0}: Error finding container 56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432: Status 404 returned error can't find the container with id 56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432 Mar 01 10:03:39 crc kubenswrapper[4792]: I0301 10:03:39.730460 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerStarted","Data":"56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432"} Mar 01 10:03:47 crc kubenswrapper[4792]: I0301 10:03:47.408708 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 
10:03:47 crc kubenswrapper[4792]: E0301 10:03:47.409600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.138201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.140433 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.142992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.143069 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.143092 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.160320 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.179093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 
10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.280754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.301829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"auto-csr-approver-29539324-wjq94\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:00 crc kubenswrapper[4792]: I0301 10:04:00.460518 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:02 crc kubenswrapper[4792]: I0301 10:04:02.410757 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:04:02 crc kubenswrapper[4792]: E0301 10:04:02.411364 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:04:12 crc kubenswrapper[4792]: E0301 10:04:12.419075 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 01 10:04:12 crc kubenswrapper[4792]: 
E0301 10:04:12.420680 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slzdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,Mount
Propagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ee1c75ce-61f7-4ce5-a757-b7405d7135bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 01 10:04:12 crc kubenswrapper[4792]: E0301 10:04:12.421855 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" Mar 01 10:04:12 crc kubenswrapper[4792]: I0301 10:04:12.844984 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:04:13 crc kubenswrapper[4792]: I0301 10:04:13.050632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" 
event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerStarted","Data":"05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a"} Mar 01 10:04:13 crc kubenswrapper[4792]: E0301 10:04:13.052293 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" Mar 01 10:04:13 crc kubenswrapper[4792]: I0301 10:04:13.408303 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:04:13 crc kubenswrapper[4792]: E0301 10:04:13.408631 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:04:15 crc kubenswrapper[4792]: I0301 10:04:15.072700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerStarted","Data":"e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18"} Mar 01 10:04:15 crc kubenswrapper[4792]: I0301 10:04:15.094893 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539324-wjq94" podStartSLOduration=14.252390903 podStartE2EDuration="15.094871951s" podCreationTimestamp="2026-03-01 10:04:00 +0000 UTC" firstStartedPulling="2026-03-01 10:04:12.842069752 +0000 UTC m=+3382.083948949" lastFinishedPulling="2026-03-01 
10:04:13.6845508 +0000 UTC m=+3382.926429997" observedRunningTime="2026-03-01 10:04:15.088141873 +0000 UTC m=+3384.330021100" watchObservedRunningTime="2026-03-01 10:04:15.094871951 +0000 UTC m=+3384.336751158" Mar 01 10:04:16 crc kubenswrapper[4792]: I0301 10:04:16.083872 4792 generic.go:334] "Generic (PLEG): container finished" podID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerID="e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18" exitCode=0 Mar 01 10:04:16 crc kubenswrapper[4792]: I0301 10:04:16.083982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerDied","Data":"e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18"} Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.474127 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.667296 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") pod \"41b7071d-7243-4ca4-82e6-c153c3001d1f\" (UID: \"41b7071d-7243-4ca4-82e6-c153c3001d1f\") " Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.673133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b" (OuterVolumeSpecName: "kube-api-access-txf4b") pod "41b7071d-7243-4ca4-82e6-c153c3001d1f" (UID: "41b7071d-7243-4ca4-82e6-c153c3001d1f"). InnerVolumeSpecName "kube-api-access-txf4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:04:17 crc kubenswrapper[4792]: I0301 10:04:17.770895 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txf4b\" (UniqueName: \"kubernetes.io/projected/41b7071d-7243-4ca4-82e6-c153c3001d1f-kube-api-access-txf4b\") on node \"crc\" DevicePath \"\"" Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539324-wjq94" event={"ID":"41b7071d-7243-4ca4-82e6-c153c3001d1f","Type":"ContainerDied","Data":"05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a"} Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101673 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05ecd0e3b57f89e85c7cc446a097e5719dfaf3e02c4997b31325f714bb21377a" Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.101702 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539324-wjq94" Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.155630 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 10:04:18 crc kubenswrapper[4792]: I0301 10:04:18.164435 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539318-d9tmc"] Mar 01 10:04:19 crc kubenswrapper[4792]: I0301 10:04:19.419242 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3aad6a-fbd9-4a24-a489-33507709811b" path="/var/lib/kubelet/pods/bf3aad6a-fbd9-4a24-a489-33507709811b/volumes" Mar 01 10:04:25 crc kubenswrapper[4792]: I0301 10:04:25.885297 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.190023 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerStarted","Data":"0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47"} Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.218384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.143779626 podStartE2EDuration="51.218366932s" podCreationTimestamp="2026-03-01 10:03:36 +0000 UTC" firstStartedPulling="2026-03-01 10:03:38.807720917 +0000 UTC m=+3348.049600114" lastFinishedPulling="2026-03-01 10:04:25.882308223 +0000 UTC m=+3395.124187420" observedRunningTime="2026-03-01 10:04:27.208047115 +0000 UTC m=+3396.449926322" watchObservedRunningTime="2026-03-01 10:04:27.218366932 +0000 UTC m=+3396.460246129" Mar 01 10:04:27 crc kubenswrapper[4792]: I0301 10:04:27.408535 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:04:27 crc kubenswrapper[4792]: E0301 10:04:27.408804 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:04:32 crc kubenswrapper[4792]: I0301 10:04:32.415460 4792 scope.go:117] "RemoveContainer" containerID="7152ca7878f74975420b6650bf54cd79c2b676e3ea865b2cbf55b92459e46fa6" Mar 01 10:04:42 crc kubenswrapper[4792]: I0301 10:04:42.409453 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:04:42 crc kubenswrapper[4792]: E0301 10:04:42.410332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:04:56 crc kubenswrapper[4792]: I0301 10:04:56.408353 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:04:56 crc kubenswrapper[4792]: E0301 10:04:56.409595 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.106009 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:00 crc kubenswrapper[4792]: E0301 10:05:00.107050 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.107066 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.107313 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" containerName="oc" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.110125 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.128593 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.289959 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.290441 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.290505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.392889 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.393049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.415848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"redhat-operators-smtml\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.427815 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:00 crc kubenswrapper[4792]: I0301 10:05:00.880201 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.547141 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" exitCode=0 Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.547400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92"} Mar 01 10:05:01 crc kubenswrapper[4792]: I0301 10:05:01.548064 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"6dea6bb4d8bc1761ebada466cb6f422a987107e9b6cee2ade5e5ce671abc23fd"} Mar 01 10:05:02 crc kubenswrapper[4792]: I0301 10:05:02.557195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} Mar 01 10:05:07 crc kubenswrapper[4792]: I0301 10:05:07.599253 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" exitCode=0 Mar 01 10:05:07 crc kubenswrapper[4792]: I0301 10:05:07.599331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" 
event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.409727 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:08 crc kubenswrapper[4792]: E0301 10:05:08.411086 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.624315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerStarted","Data":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} Mar 01 10:05:08 crc kubenswrapper[4792]: I0301 10:05:08.650808 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smtml" podStartSLOduration=2.165407095 podStartE2EDuration="8.650784495s" podCreationTimestamp="2026-03-01 10:05:00 +0000 UTC" firstStartedPulling="2026-03-01 10:05:01.549994467 +0000 UTC m=+3430.791873664" lastFinishedPulling="2026-03-01 10:05:08.035371837 +0000 UTC m=+3437.277251064" observedRunningTime="2026-03-01 10:05:08.644594971 +0000 UTC m=+3437.886474178" watchObservedRunningTime="2026-03-01 10:05:08.650784495 +0000 UTC m=+3437.892663702" Mar 01 10:05:10 crc kubenswrapper[4792]: I0301 10:05:10.428892 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:10 crc 
kubenswrapper[4792]: I0301 10:05:10.429323 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:11 crc kubenswrapper[4792]: I0301 10:05:11.480178 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:11 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:11 crc kubenswrapper[4792]: > Mar 01 10:05:21 crc kubenswrapper[4792]: I0301 10:05:21.475547 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:21 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:21 crc kubenswrapper[4792]: > Mar 01 10:05:22 crc kubenswrapper[4792]: I0301 10:05:22.408646 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:22 crc kubenswrapper[4792]: E0301 10:05:22.409242 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:31 crc kubenswrapper[4792]: I0301 10:05:31.472725 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:31 crc kubenswrapper[4792]: timeout: 
failed to connect service ":50051" within 1s Mar 01 10:05:31 crc kubenswrapper[4792]: > Mar 01 10:05:36 crc kubenswrapper[4792]: I0301 10:05:36.409197 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:36 crc kubenswrapper[4792]: E0301 10:05:36.410683 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:41 crc kubenswrapper[4792]: I0301 10:05:41.469815 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:05:41 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:05:41 crc kubenswrapper[4792]: > Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.409111 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:05:50 crc kubenswrapper[4792]: E0301 10:05:50.409947 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.478183 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.533756 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:50 crc kubenswrapper[4792]: I0301 10:05:50.712549 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.002629 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smtml" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" containerID="cri-o://8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" gracePeriod=2 Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.561459 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.619840 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.620090 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: \"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.620430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") pod \"b24525d9-4be1-47e3-9588-e0747410912c\" (UID: 
\"b24525d9-4be1-47e3-9588-e0747410912c\") " Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.621426 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities" (OuterVolumeSpecName: "utilities") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.627811 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb" (OuterVolumeSpecName: "kube-api-access-gn2lb") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "kube-api-access-gn2lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.723112 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.723147 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn2lb\" (UniqueName: \"kubernetes.io/projected/b24525d9-4be1-47e3-9588-e0747410912c-kube-api-access-gn2lb\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.793042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b24525d9-4be1-47e3-9588-e0747410912c" (UID: "b24525d9-4be1-47e3-9588-e0747410912c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:05:52 crc kubenswrapper[4792]: I0301 10:05:52.825620 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24525d9-4be1-47e3-9588-e0747410912c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014323 4792 generic.go:334] "Generic (PLEG): container finished" podID="b24525d9-4be1-47e3-9588-e0747410912c" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" exitCode=0 Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smtml" event={"ID":"b24525d9-4be1-47e3-9588-e0747410912c","Type":"ContainerDied","Data":"6dea6bb4d8bc1761ebada466cb6f422a987107e9b6cee2ade5e5ce671abc23fd"} Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014769 4792 scope.go:117] "RemoveContainer" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.014547 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smtml" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.040038 4792 scope.go:117] "RemoveContainer" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.067157 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.070332 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smtml"] Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.074165 4792 scope.go:117] "RemoveContainer" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.144109 4792 scope.go:117] "RemoveContainer" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.146304 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": container with ID starting with 8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55 not found: ID does not exist" containerID="8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.146347 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55"} err="failed to get container status \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": rpc error: code = NotFound desc = could not find container \"8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55\": container with ID starting with 8b0cfdb40aa117b03ff8e1e7997effe581bdac2ed23c8e645490c3672c724a55 not found: ID does 
not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.146374 4792 scope.go:117] "RemoveContainer" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.147472 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": container with ID starting with b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1 not found: ID does not exist" containerID="b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.147529 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1"} err="failed to get container status \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": rpc error: code = NotFound desc = could not find container \"b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1\": container with ID starting with b6e10ea71933446956957d2f5deb2f862f01ab2963ae0eec5c79b22f02f940b1 not found: ID does not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.147559 4792 scope.go:117] "RemoveContainer" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: E0301 10:05:53.148359 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": container with ID starting with 578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92 not found: ID does not exist" containerID="578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.148386 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92"} err="failed to get container status \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": rpc error: code = NotFound desc = could not find container \"578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92\": container with ID starting with 578e35be83b3d72216917a0e56b3301462f3ee60a7f7deba8b5ade43a2c43a92 not found: ID does not exist" Mar 01 10:05:53 crc kubenswrapper[4792]: I0301 10:05:53.423890 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24525d9-4be1-47e3-9588-e0747410912c" path="/var/lib/kubelet/pods/b24525d9-4be1-47e3-9588-e0747410912c/volumes" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.147880 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148766 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-content" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148778 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-content" Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148805 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148810 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: E0301 10:06:00.148833 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-utilities" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.148839 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="extract-utilities" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.149036 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24525d9-4be1-47e3-9588-e0747410912c" containerName="registry-server" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.149660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.152429 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.152616 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.155240 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.157692 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.261020 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.362805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: 
\"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.407097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"auto-csr-approver-29539326-7hz59\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.498989 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:00 crc kubenswrapper[4792]: I0301 10:06:00.982868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:06:01 crc kubenswrapper[4792]: I0301 10:06:01.085342 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerStarted","Data":"10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a"} Mar 01 10:06:01 crc kubenswrapper[4792]: I0301 10:06:01.417963 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:01 crc kubenswrapper[4792]: E0301 10:06:01.418356 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:03 crc kubenswrapper[4792]: I0301 10:06:03.100679 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerID="985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f" exitCode=0 Mar 01 10:06:03 crc kubenswrapper[4792]: I0301 10:06:03.101047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerDied","Data":"985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f"} Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.520211 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.543374 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") pod \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\" (UID: \"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8\") " Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.594173 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t" (OuterVolumeSpecName: "kube-api-access-5fk2t") pod "e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" (UID: "e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8"). InnerVolumeSpecName "kube-api-access-5fk2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:06:04 crc kubenswrapper[4792]: I0301 10:06:04.646110 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fk2t\" (UniqueName: \"kubernetes.io/projected/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8-kube-api-access-5fk2t\") on node \"crc\" DevicePath \"\"" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539326-7hz59" event={"ID":"e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8","Type":"ContainerDied","Data":"10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a"} Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117749 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e06fef5e07f2fd1cf59f8e8f7565d20e61b5a4b1dd074783b869da3bce167a" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.117488 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539326-7hz59" Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.585326 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:06:05 crc kubenswrapper[4792]: I0301 10:06:05.593294 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539320-bzqcr"] Mar 01 10:06:07 crc kubenswrapper[4792]: I0301 10:06:07.421671 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c085ec-b23e-4ad9-ae76-9775921b667d" path="/var/lib/kubelet/pods/a5c085ec-b23e-4ad9-ae76-9775921b667d/volumes" Mar 01 10:06:13 crc kubenswrapper[4792]: I0301 10:06:13.408723 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:13 crc kubenswrapper[4792]: E0301 10:06:13.409641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:27 crc kubenswrapper[4792]: I0301 10:06:27.414034 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:27 crc kubenswrapper[4792]: E0301 10:06:27.414677 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:06:32 crc kubenswrapper[4792]: I0301 10:06:32.527520 4792 scope.go:117] "RemoveContainer" containerID="4994cd62771f4057b6d1f58071d3828ce7f1350994c490b87bf0e5cb1e97dff9" Mar 01 10:06:39 crc kubenswrapper[4792]: I0301 10:06:39.409675 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:06:40 crc kubenswrapper[4792]: I0301 10:06:40.548851 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.907563 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:09 crc kubenswrapper[4792]: E0301 10:07:09.908393 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.908406 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.908591 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" containerName="oc" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.910570 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:09 crc kubenswrapper[4792]: I0301 10:07:09.924884 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.089819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: 
\"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192066 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.192728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.193012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") 
" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.218211 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"community-operators-48qp5\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.238424 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.721144 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:10 crc kubenswrapper[4792]: I0301 10:07:10.793762 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"94eba7eaa028ed08ef077cfc6746bc96f898cc9f246702b2a5a689eae3288e03"} Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.803581 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2" exitCode=0 Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.803659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2"} Mar 01 10:07:11 crc kubenswrapper[4792]: I0301 10:07:11.808226 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:07:12 crc kubenswrapper[4792]: I0301 10:07:12.813489 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198"} Mar 01 10:07:14 crc kubenswrapper[4792]: I0301 10:07:14.830717 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198" exitCode=0 Mar 01 10:07:14 crc kubenswrapper[4792]: I0301 10:07:14.830774 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198"} Mar 01 10:07:15 crc kubenswrapper[4792]: I0301 10:07:15.841181 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerStarted","Data":"539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69"} Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.021636 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-48qp5" podStartSLOduration=7.6135256909999995 podStartE2EDuration="11.021616673s" podCreationTimestamp="2026-03-01 10:07:09 +0000 UTC" firstStartedPulling="2026-03-01 10:07:11.808010779 +0000 UTC m=+3561.049889976" lastFinishedPulling="2026-03-01 10:07:15.216101761 +0000 UTC m=+3564.457980958" observedRunningTime="2026-03-01 10:07:15.86283305 +0000 UTC m=+3565.104712257" watchObservedRunningTime="2026-03-01 10:07:20.021616673 +0000 UTC m=+3569.263495870" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.030786 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.032680 4792 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.056114 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216814 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.216994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.239343 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.239821 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319700 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.319779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.321274 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.321371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.347648 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"certified-operators-pph6g\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.358236 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:20 crc kubenswrapper[4792]: I0301 10:07:20.930548 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.287869 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-48qp5" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:21 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:21 crc kubenswrapper[4792]: > Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904713 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74" exitCode=0 Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904769 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"} Mar 01 10:07:21 crc kubenswrapper[4792]: I0301 10:07:21.904819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"20eeae96dbcf40cf915287f68860afaea37e4de6c7d4d59e23e4ddf89a89c82c"} Mar 01 
10:07:22 crc kubenswrapper[4792]: I0301 10:07:22.914494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.029255 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.033969 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.058886 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.230926 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.231420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.231494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: 
\"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.334529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.335105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.335143 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " 
pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.355934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"redhat-marketplace-ltk8j\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.363212 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.858324 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:25 crc kubenswrapper[4792]: W0301 10:07:25.861995 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3512046c_8b15_4ab0_8c94_39d0d0f2e73c.slice/crio-7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2 WatchSource:0}: Error finding container 7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2: Status 404 returned error can't find the container with id 7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2 Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.940318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2"} Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.943119 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04" exitCode=0 Mar 01 10:07:25 crc kubenswrapper[4792]: I0301 10:07:25.943147 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.954518 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3" exitCode=0 Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.954678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.958647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerStarted","Data":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"} Mar 01 10:07:26 crc kubenswrapper[4792]: I0301 10:07:26.997366 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pph6g" podStartSLOduration=2.550711879 podStartE2EDuration="6.997346475s" podCreationTimestamp="2026-03-01 10:07:20 +0000 UTC" firstStartedPulling="2026-03-01 10:07:21.907263881 +0000 UTC m=+3571.149143068" lastFinishedPulling="2026-03-01 10:07:26.353898467 +0000 UTC m=+3575.595777664" observedRunningTime="2026-03-01 10:07:26.995744625 +0000 UTC m=+3576.237623822" watchObservedRunningTime="2026-03-01 10:07:26.997346475 +0000 UTC m=+3576.239225672" Mar 01 10:07:27 crc kubenswrapper[4792]: I0301 10:07:27.983837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" 
event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.000554 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55" exitCode=0 Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.000810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.287857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.353201 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.389697 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:30 crc kubenswrapper[4792]: I0301 10:07:30.389737 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.011268 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerStarted","Data":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"} Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.052834 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-ltk8j" podStartSLOduration=3.5594541939999997 podStartE2EDuration="7.052813392s" podCreationTimestamp="2026-03-01 10:07:24 +0000 UTC" firstStartedPulling="2026-03-01 10:07:26.957699886 +0000 UTC m=+3576.199579083" lastFinishedPulling="2026-03-01 10:07:30.451059084 +0000 UTC m=+3579.692938281" observedRunningTime="2026-03-01 10:07:31.052043073 +0000 UTC m=+3580.293922270" watchObservedRunningTime="2026-03-01 10:07:31.052813392 +0000 UTC m=+3580.294692589" Mar 01 10:07:31 crc kubenswrapper[4792]: I0301 10:07:31.451505 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:31 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:31 crc kubenswrapper[4792]: > Mar 01 10:07:32 crc kubenswrapper[4792]: I0301 10:07:32.593372 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:32 crc kubenswrapper[4792]: I0301 10:07:32.594260 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-48qp5" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" containerID="cri-o://539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69" gracePeriod=2 Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.063151 4792 generic.go:334] "Generic (PLEG): container finished" podID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerID="539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69" exitCode=0 Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.063445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" 
event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69"} Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.175610 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.319878 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.319988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.320054 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") pod \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\" (UID: \"682f4383-d3fb-4efe-89f5-e496b4b3b71b\") " Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.320837 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities" (OuterVolumeSpecName: "utilities") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.328102 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7" (OuterVolumeSpecName: "kube-api-access-xvpl7") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "kube-api-access-xvpl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.368059 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "682f4383-d3fb-4efe-89f5-e496b4b3b71b" (UID: "682f4383-d3fb-4efe-89f5-e496b4b3b71b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.422861 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvpl7\" (UniqueName: \"kubernetes.io/projected/682f4383-d3fb-4efe-89f5-e496b4b3b71b-kube-api-access-xvpl7\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.423156 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:33 crc kubenswrapper[4792]: I0301 10:07:33.423230 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/682f4383-d3fb-4efe-89f5-e496b4b3b71b-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-48qp5" 
event={"ID":"682f4383-d3fb-4efe-89f5-e496b4b3b71b","Type":"ContainerDied","Data":"94eba7eaa028ed08ef077cfc6746bc96f898cc9f246702b2a5a689eae3288e03"} Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074355 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-48qp5" Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.074638 4792 scope.go:117] "RemoveContainer" containerID="539ab3bdb0f8e33c45ad9a4e194bc21761a2d32030aa18bd3fd289463fab5c69" Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.095491 4792 scope.go:117] "RemoveContainer" containerID="be7a136c39cab1fb688e8478bf8ada2e256e2ff41b168a20058a43540f860198" Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.110152 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.119706 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-48qp5"] Mar 01 10:07:34 crc kubenswrapper[4792]: I0301 10:07:34.158275 4792 scope.go:117] "RemoveContainer" containerID="550d4cbb086003d878b362fc46f3972d5e8a3737e8bfdb31dd607b8a5db535b2" Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.364637 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.366315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:35 crc kubenswrapper[4792]: I0301 10:07:35.420486 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" path="/var/lib/kubelet/pods/682f4383-d3fb-4efe-89f5-e496b4b3b71b/volumes" Mar 01 10:07:36 crc kubenswrapper[4792]: I0301 10:07:36.411400 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-ltk8j" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:36 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:36 crc kubenswrapper[4792]: > Mar 01 10:07:41 crc kubenswrapper[4792]: I0301 10:07:41.437247 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" probeResult="failure" output=< Mar 01 10:07:41 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:07:41 crc kubenswrapper[4792]: > Mar 01 10:07:45 crc kubenswrapper[4792]: I0301 10:07:45.440836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:45 crc kubenswrapper[4792]: I0301 10:07:45.500977 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.173502 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.221205 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ltk8j" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" containerID="cri-o://caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" gracePeriod=2 Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.862680 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.901620 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") pod \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\" (UID: \"3512046c-8b15-4ab0-8c94-39d0d0f2e73c\") " Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.902305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities" (OuterVolumeSpecName: "utilities") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.910856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4" (OuterVolumeSpecName: "kube-api-access-974g4") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "kube-api-access-974g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:07:47 crc kubenswrapper[4792]: I0301 10:07:47.929361 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3512046c-8b15-4ab0-8c94-39d0d0f2e73c" (UID: "3512046c-8b15-4ab0-8c94-39d0d0f2e73c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003735 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003778 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.003791 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974g4\" (UniqueName: \"kubernetes.io/projected/3512046c-8b15-4ab0-8c94-39d0d0f2e73c-kube-api-access-974g4\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230645 4792 generic.go:334] "Generic (PLEG): container finished" podID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" exitCode=0 Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230712 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ltk8j" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.230713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"} Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.231697 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ltk8j" event={"ID":"3512046c-8b15-4ab0-8c94-39d0d0f2e73c","Type":"ContainerDied","Data":"7a85f6553115dd71b80a2c8f43818f4072ecec30d5eb3e65b978374e0bc127b2"} Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.231729 4792 scope.go:117] "RemoveContainer" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.273556 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.277110 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ltk8j"] Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.287620 4792 scope.go:117] "RemoveContainer" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.311480 4792 scope.go:117] "RemoveContainer" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.365133 4792 scope.go:117] "RemoveContainer" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.365800 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": container with ID starting with caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178 not found: ID does not exist" containerID="caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366026 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178"} err="failed to get container status \"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": rpc error: code = NotFound desc = could not find container \"caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178\": container with ID starting with caf38347042ec7f347ab71967d0dc62fd40a1d00a3ad82bbffdf18b0ab66e178 not found: ID does not exist" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366058 4792 scope.go:117] "RemoveContainer" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55" Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.366384 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": container with ID starting with 01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55 not found: ID does not exist" containerID="01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366407 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55"} err="failed to get container status \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": rpc error: code = NotFound desc = could not find container \"01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55\": container with ID 
starting with 01a152634348be67850a0ff44b28b264a62b707c7ec30ef00d95e9f0a4621d55 not found: ID does not exist" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366421 4792 scope.go:117] "RemoveContainer" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3" Mar 01 10:07:48 crc kubenswrapper[4792]: E0301 10:07:48.366931 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": container with ID starting with b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3 not found: ID does not exist" containerID="b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3" Mar 01 10:07:48 crc kubenswrapper[4792]: I0301 10:07:48.366972 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3"} err="failed to get container status \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": rpc error: code = NotFound desc = could not find container \"b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3\": container with ID starting with b863829c371b01a1972e917a6d1591c043a273261668f7d679b1616373a821a3 not found: ID does not exist" Mar 01 10:07:49 crc kubenswrapper[4792]: I0301 10:07:49.418671 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" path="/var/lib/kubelet/pods/3512046c-8b15-4ab0-8c94-39d0d0f2e73c/volumes" Mar 01 10:07:50 crc kubenswrapper[4792]: I0301 10:07:50.427315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:50 crc kubenswrapper[4792]: I0301 10:07:50.504245 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:51 crc 
kubenswrapper[4792]: I0301 10:07:51.375693 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.268655 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pph6g" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" containerID="cri-o://f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" gracePeriod=2 Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.881923 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902396 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902592 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.902641 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") pod \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\" (UID: \"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef\") " Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.904302 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities" (OuterVolumeSpecName: "utilities") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:52 crc kubenswrapper[4792]: I0301 10:07:52.916207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q" (OuterVolumeSpecName: "kube-api-access-ch59q") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "kube-api-access-ch59q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.020754 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch59q\" (UniqueName: \"kubernetes.io/projected/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-kube-api-access-ch59q\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.020784 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.059119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" (UID: "8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.125687 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.278973 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" exitCode=0 Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279017 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"} Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pph6g" event={"ID":"8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef","Type":"ContainerDied","Data":"20eeae96dbcf40cf915287f68860afaea37e4de6c7d4d59e23e4ddf89a89c82c"} Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279054 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pph6g" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.279061 4792 scope.go:117] "RemoveContainer" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.296509 4792 scope.go:117] "RemoveContainer" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.319208 4792 scope.go:117] "RemoveContainer" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.323545 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.332103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pph6g"] Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369228 4792 scope.go:117] "RemoveContainer" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.369555 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": container with ID starting with f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49 not found: ID does not exist" containerID="f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369586 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49"} err="failed to get container status \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": rpc error: code = NotFound desc = could not find 
container \"f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49\": container with ID starting with f2df48ec621324b00f73adb4b8eeec69ea7344052fadf1865efb512b64a14a49 not found: ID does not exist" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369607 4792 scope.go:117] "RemoveContainer" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04" Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.369933 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": container with ID starting with e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04 not found: ID does not exist" containerID="e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369953 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04"} err="failed to get container status \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": rpc error: code = NotFound desc = could not find container \"e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04\": container with ID starting with e98ec606238ece22cf5bf54f0d9b14ebe2b48e8479d47b0d013ac5c426816a04 not found: ID does not exist" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.369970 4792 scope.go:117] "RemoveContainer" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74" Mar 01 10:07:53 crc kubenswrapper[4792]: E0301 10:07:53.370263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": container with ID starting with bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74 not found: ID does 
not exist" containerID="bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.370285 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74"} err="failed to get container status \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": rpc error: code = NotFound desc = could not find container \"bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74\": container with ID starting with bdb69b17e8cbcb56946bda3a57e20355267927603056c7235362b2f0563bbc74 not found: ID does not exist" Mar 01 10:07:53 crc kubenswrapper[4792]: I0301 10:07:53.418626 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" path="/var/lib/kubelet/pods/8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef/volumes" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.146925 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"] Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147895 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147930 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147941 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147949 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147959 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.147968 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.147993 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148002 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-utilities" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148015 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148023 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148038 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148045 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148067 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148075 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148091 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148099 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="extract-content" Mar 01 10:08:00 crc kubenswrapper[4792]: E0301 10:08:00.148118 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148125 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148336 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3512046c-8b15-4ab0-8c94-39d0d0f2e73c" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148356 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="682f4383-d3fb-4efe-89f5-e496b4b3b71b" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.148383 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d7e13c7-2fc8-4f25-b93a-132d0b0c80ef" containerName="registry-server" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.149141 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.151943 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.152197 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.152412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.159351 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"] Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.259462 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.361154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.394955 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"auto-csr-approver-29539328-6t6hj\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " 
pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.468328 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:00 crc kubenswrapper[4792]: I0301 10:08:00.952371 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"] Mar 01 10:08:01 crc kubenswrapper[4792]: I0301 10:08:01.341476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerStarted","Data":"df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6"} Mar 01 10:08:02 crc kubenswrapper[4792]: I0301 10:08:02.354485 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerStarted","Data":"c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe"} Mar 01 10:08:02 crc kubenswrapper[4792]: I0301 10:08:02.377572 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" podStartSLOduration=1.572232772 podStartE2EDuration="2.377546733s" podCreationTimestamp="2026-03-01 10:08:00 +0000 UTC" firstStartedPulling="2026-03-01 10:08:00.971705255 +0000 UTC m=+3610.213584452" lastFinishedPulling="2026-03-01 10:08:01.777019216 +0000 UTC m=+3611.018898413" observedRunningTime="2026-03-01 10:08:02.370554009 +0000 UTC m=+3611.612433226" watchObservedRunningTime="2026-03-01 10:08:02.377546733 +0000 UTC m=+3611.619425940" Mar 01 10:08:03 crc kubenswrapper[4792]: I0301 10:08:03.364173 4792 generic.go:334] "Generic (PLEG): container finished" podID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerID="c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe" exitCode=0 Mar 01 10:08:03 crc 
kubenswrapper[4792]: I0301 10:08:03.364217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerDied","Data":"c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe"} Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.759981 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.851787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") pod \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\" (UID: \"42e71344-31b1-4817-b2e4-dd9aebb9d38e\") " Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.857371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq" (OuterVolumeSpecName: "kube-api-access-qmmjq") pod "42e71344-31b1-4817-b2e4-dd9aebb9d38e" (UID: "42e71344-31b1-4817-b2e4-dd9aebb9d38e"). InnerVolumeSpecName "kube-api-access-qmmjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:08:04 crc kubenswrapper[4792]: I0301 10:08:04.954407 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmmjq\" (UniqueName: \"kubernetes.io/projected/42e71344-31b1-4817-b2e4-dd9aebb9d38e-kube-api-access-qmmjq\") on node \"crc\" DevicePath \"\"" Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.386957 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" event={"ID":"42e71344-31b1-4817-b2e4-dd9aebb9d38e","Type":"ContainerDied","Data":"df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6"} Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.387013 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df7225161bec7252d57bf5e2340343fa6752a0760a457d189ea4e699057f74c6" Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.387010 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539328-6t6hj" Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.842328 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:08:05 crc kubenswrapper[4792]: I0301 10:08:05.870645 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539322-cwkrx"] Mar 01 10:08:07 crc kubenswrapper[4792]: I0301 10:08:07.419060 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b" path="/var/lib/kubelet/pods/d1097b0e-1156-4f3c-b1e9-6f7b83d0e07b/volumes" Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.696983 4792 scope.go:117] "RemoveContainer" containerID="ec82ee72087ccca998ba33c4f0000c5036f1693b6f0bbb69a46ea60c4b7efd7e" Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.773062 4792 scope.go:117] "RemoveContainer" 
containerID="e9350b459945620c5d55700a03e5aa49740d1a4854b26be997bf075b28191bfe" Mar 01 10:08:32 crc kubenswrapper[4792]: I0301 10:08:32.806219 4792 scope.go:117] "RemoveContainer" containerID="f8623324113a06ed0f840a7abbd929eaf57c541d1aae8f4b05f1bf6b431b24fe" Mar 01 10:09:04 crc kubenswrapper[4792]: I0301 10:09:04.943478 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:09:04 crc kubenswrapper[4792]: I0301 10:09:04.944760 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:09:34 crc kubenswrapper[4792]: I0301 10:09:34.942668 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:09:34 crc kubenswrapper[4792]: I0301 10:09:34.943098 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.193316 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:10:00 crc kubenswrapper[4792]: E0301 10:10:00.193997 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194010 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194221 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" containerName="oc" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.194884 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198314 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.198629 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.206772 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.295507 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.396979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9chm\" 
(UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.415374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"auto-csr-approver-29539330-wqct7\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.514550 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:00 crc kubenswrapper[4792]: I0301 10:10:00.986002 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:10:01 crc kubenswrapper[4792]: I0301 10:10:01.394197 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerStarted","Data":"b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc"} Mar 01 10:10:02 crc kubenswrapper[4792]: I0301 10:10:02.403010 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerStarted","Data":"efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613"} Mar 01 10:10:02 crc kubenswrapper[4792]: I0301 10:10:02.420196 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539330-wqct7" podStartSLOduration=1.272680491 podStartE2EDuration="2.420174711s" podCreationTimestamp="2026-03-01 10:10:00 +0000 UTC" 
firstStartedPulling="2026-03-01 10:10:00.988252172 +0000 UTC m=+3730.230131409" lastFinishedPulling="2026-03-01 10:10:02.135746432 +0000 UTC m=+3731.377625629" observedRunningTime="2026-03-01 10:10:02.420028538 +0000 UTC m=+3731.661907735" watchObservedRunningTime="2026-03-01 10:10:02.420174711 +0000 UTC m=+3731.662053898" Mar 01 10:10:03 crc kubenswrapper[4792]: I0301 10:10:03.418933 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerID="efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613" exitCode=0 Mar 01 10:10:03 crc kubenswrapper[4792]: I0301 10:10:03.419004 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerDied","Data":"efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613"} Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.834456 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.891332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9chm\" (UniqueName: \"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") pod \"a9025376-622f-4b7e-94d5-c5b136e139d8\" (UID: \"a9025376-622f-4b7e-94d5-c5b136e139d8\") " Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.897105 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm" (OuterVolumeSpecName: "kube-api-access-r9chm") pod "a9025376-622f-4b7e-94d5-c5b136e139d8" (UID: "a9025376-622f-4b7e-94d5-c5b136e139d8"). InnerVolumeSpecName "kube-api-access-r9chm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944365 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944584 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.944718 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.948867 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.949107 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" gracePeriod=600 Mar 01 10:10:04 crc kubenswrapper[4792]: I0301 10:10:04.996109 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9chm\" (UniqueName: 
\"kubernetes.io/projected/a9025376-622f-4b7e-94d5-c5b136e139d8-kube-api-access-r9chm\") on node \"crc\" DevicePath \"\"" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443579 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" exitCode=0 Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443737 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.443874 4792 scope.go:117] "RemoveContainer" containerID="a0784f01e11c857b74dcb74a33ddd84c26dac9667c59a3f25e6ad7134aeabd9b" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.449482 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539330-wqct7" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.449396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539330-wqct7" event={"ID":"a9025376-622f-4b7e-94d5-c5b136e139d8","Type":"ContainerDied","Data":"b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc"} Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.454938 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1a908e36847c130d22fa7138ad124b33ebf296414af538320c02b60d31d23bc" Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.900216 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:10:05 crc kubenswrapper[4792]: I0301 10:10:05.908858 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539324-wjq94"] Mar 01 10:10:07 crc kubenswrapper[4792]: I0301 10:10:07.429484 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b7071d-7243-4ca4-82e6-c153c3001d1f" path="/var/lib/kubelet/pods/41b7071d-7243-4ca4-82e6-c153c3001d1f/volumes" Mar 01 10:10:32 crc kubenswrapper[4792]: I0301 10:10:32.962793 4792 scope.go:117] "RemoveContainer" containerID="e004ff88b9b0e0f691af76b372e4044089dce1eaaf6619be9de9fed1b4d58c18" Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.039443 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.048699 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-svb22"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.058571 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:10:56 crc kubenswrapper[4792]: I0301 10:10:56.068723 4792 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/manila-fa75-account-create-update-s2kng"] Mar 01 10:10:57 crc kubenswrapper[4792]: I0301 10:10:57.419392 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0441b486-847a-4f32-8df2-a1284f39ee5d" path="/var/lib/kubelet/pods/0441b486-847a-4f32-8df2-a1284f39ee5d/volumes" Mar 01 10:10:57 crc kubenswrapper[4792]: I0301 10:10:57.421681 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7736148-bc12-4621-a1d2-efc4a0143b42" path="/var/lib/kubelet/pods/c7736148-bc12-4621-a1d2-efc4a0143b42/volumes" Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.047226 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.059847 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-p2dtn"] Mar 01 10:11:31 crc kubenswrapper[4792]: I0301 10:11:31.420061 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bf5dae-6217-4644-9c9b-65d3886a4dc1" path="/var/lib/kubelet/pods/01bf5dae-6217-4644-9c9b-65d3886a4dc1/volumes" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.036519 4792 scope.go:117] "RemoveContainer" containerID="bf57ceafd6066a28052f3666ed7d384740c5837c8329274397bd6fa48c44d661" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.076558 4792 scope.go:117] "RemoveContainer" containerID="a0b92e5e12c9f6c28a62c7a978997e469e5d99be007aba61824b2b6c8d62ffa5" Mar 01 10:11:33 crc kubenswrapper[4792]: I0301 10:11:33.123136 4792 scope.go:117] "RemoveContainer" containerID="31b1b88de471cccce9a774cf2494168f31cb583ae731b5c4c5efee9a815b5533" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.147402 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:00 crc kubenswrapper[4792]: E0301 10:12:00.148292 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.148307 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.148503 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" containerName="oc" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.149369 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.152042 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.152322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.156574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.158021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.283239 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.384622 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: 
\"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.409120 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"auto-csr-approver-29539332-f6brk\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.477793 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:00 crc kubenswrapper[4792]: I0301 10:12:00.918786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:12:01 crc kubenswrapper[4792]: I0301 10:12:01.450273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerStarted","Data":"aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428"} Mar 01 10:12:02 crc kubenswrapper[4792]: I0301 10:12:02.464800 4792 generic.go:334] "Generic (PLEG): container finished" podID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerID="92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963" exitCode=0 Mar 01 10:12:02 crc kubenswrapper[4792]: I0301 10:12:02.465596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerDied","Data":"92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963"} Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.482504 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539332-f6brk" event={"ID":"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3","Type":"ContainerDied","Data":"aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428"} Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.482774 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf8adc92bf2959fc5697ba19835736125cffd65ce1dccd42bd001b1bacd2428" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.506023 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.572970 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") pod \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\" (UID: \"5f9c6b5a-8834-45a7-bf9d-000bcfd068f3\") " Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.587771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7" (OuterVolumeSpecName: "kube-api-access-sj8q7") pod "5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" (UID: "5f9c6b5a-8834-45a7-bf9d-000bcfd068f3"). InnerVolumeSpecName "kube-api-access-sj8q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:12:04 crc kubenswrapper[4792]: I0301 10:12:04.675834 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8q7\" (UniqueName: \"kubernetes.io/projected/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3-kube-api-access-sj8q7\") on node \"crc\" DevicePath \"\"" Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.490420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539332-f6brk" Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.581631 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:12:05 crc kubenswrapper[4792]: I0301 10:12:05.589893 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539326-7hz59"] Mar 01 10:12:07 crc kubenswrapper[4792]: I0301 10:12:07.419544 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8" path="/var/lib/kubelet/pods/e772cc9c-90e7-4e3a-b48d-bb13a4aa0cc8/volumes" Mar 01 10:12:33 crc kubenswrapper[4792]: I0301 10:12:33.218467 4792 scope.go:117] "RemoveContainer" containerID="985260a2ba2153789f87b2fc888d57bf9b851f86fd15aaf0dca4797eeb86773f" Mar 01 10:12:34 crc kubenswrapper[4792]: I0301 10:12:34.943366 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:12:34 crc kubenswrapper[4792]: I0301 10:12:34.944046 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:04 crc kubenswrapper[4792]: I0301 10:13:04.942463 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:13:04 crc kubenswrapper[4792]: 
I0301 10:13:04.942948 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.943437 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944032 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944080 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944930 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:13:34 crc kubenswrapper[4792]: I0301 10:13:34.944993 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" 
containerName="machine-config-daemon" containerID="cri-o://5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" gracePeriod=600 Mar 01 10:13:35 crc kubenswrapper[4792]: E0301 10:13:35.091229 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838823 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" exitCode=0 Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"} Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.838939 4792 scope.go:117] "RemoveContainer" containerID="947fc3b8131dfb1dafd7693a1530cb8db816d30e8340c8877587b0f93967d46d" Mar 01 10:13:35 crc kubenswrapper[4792]: I0301 10:13:35.840051 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:13:35 crc kubenswrapper[4792]: E0301 10:13:35.840402 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:13:50 crc kubenswrapper[4792]: I0301 10:13:50.409676 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:13:50 crc kubenswrapper[4792]: E0301 10:13:50.410502 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.152568 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: E0301 10:14:00.154930 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.155037 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.155391 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" containerName="oc" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.158221 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162225 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162448 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.162598 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.178680 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.335073 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.437694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.456462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"auto-csr-approver-29539334-c4cd2\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") " 
pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.484391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.950296 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:14:00 crc kubenswrapper[4792]: I0301 10:14:00.953708 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:14:01 crc kubenswrapper[4792]: I0301 10:14:01.064071 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerStarted","Data":"e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70"} Mar 01 10:14:02 crc kubenswrapper[4792]: I0301 10:14:02.072025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerStarted","Data":"b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823"} Mar 01 10:14:02 crc kubenswrapper[4792]: I0301 10:14:02.088199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" podStartSLOduration=1.267282525 podStartE2EDuration="2.088182109s" podCreationTimestamp="2026-03-01 10:14:00 +0000 UTC" firstStartedPulling="2026-03-01 10:14:00.953516524 +0000 UTC m=+3970.195395721" lastFinishedPulling="2026-03-01 10:14:01.774416108 +0000 UTC m=+3971.016295305" observedRunningTime="2026-03-01 10:14:02.083232167 +0000 UTC m=+3971.325111364" watchObservedRunningTime="2026-03-01 10:14:02.088182109 +0000 UTC m=+3971.330061306" Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.081630 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerID="b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823" exitCode=0
Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.081662    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerDied","Data":"b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823"}
Mar 01 10:14:03 crc kubenswrapper[4792]: I0301 10:14:03.408374    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:14:03 crc kubenswrapper[4792]: E0301 10:14:03.408735    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.470125    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2"
Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.513758    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") pod \"807c2da3-ef0e-4e89-9457-37401354a8e9\" (UID: \"807c2da3-ef0e-4e89-9457-37401354a8e9\") "
Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.522548    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4" (OuterVolumeSpecName: "kube-api-access-vdzn4") pod "807c2da3-ef0e-4e89-9457-37401354a8e9" (UID: "807c2da3-ef0e-4e89-9457-37401354a8e9"). InnerVolumeSpecName "kube-api-access-vdzn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:14:04 crc kubenswrapper[4792]: I0301 10:14:04.615787    4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdzn4\" (UniqueName: \"kubernetes.io/projected/807c2da3-ef0e-4e89-9457-37401354a8e9-kube-api-access-vdzn4\") on node \"crc\" DevicePath \"\""
Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.098976    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539334-c4cd2" event={"ID":"807c2da3-ef0e-4e89-9457-37401354a8e9","Type":"ContainerDied","Data":"e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70"}
Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.099009    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539334-c4cd2"
Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.099014    4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e137a30b501156f803311eaa6d17b90f3d610114a7971add24db537517a94c70"
Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.533378    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"]
Mar 01 10:14:05 crc kubenswrapper[4792]: I0301 10:14:05.541736    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539328-6t6hj"]
Mar 01 10:14:07 crc kubenswrapper[4792]: I0301 10:14:07.420278    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e71344-31b1-4817-b2e4-dd9aebb9d38e" path="/var/lib/kubelet/pods/42e71344-31b1-4817-b2e4-dd9aebb9d38e/volumes"
Mar 01 10:14:18 crc kubenswrapper[4792]: I0301 10:14:18.409419    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:14:18 crc kubenswrapper[4792]: E0301 10:14:18.410062    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:14:31 crc kubenswrapper[4792]: I0301 10:14:31.415781    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:14:31 crc kubenswrapper[4792]: E0301 10:14:31.416447    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:14:33 crc kubenswrapper[4792]: I0301 10:14:33.745008    4792 scope.go:117] "RemoveContainer" containerID="c3e0ad9fbd90b282daf14548418069de81001946b8230adb9c02a7933419e3fe"
Mar 01 10:14:43 crc kubenswrapper[4792]: I0301 10:14:43.409155    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:14:43 crc kubenswrapper[4792]: E0301 10:14:43.410053    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:14:55 crc kubenswrapper[4792]: I0301 10:14:55.408870    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:14:55 crc kubenswrapper[4792]: E0301 10:14:55.409764    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.208282    4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"]
Mar 01 10:15:00 crc kubenswrapper[4792]: E0301 10:15:00.209303    4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.209321    4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.209534    4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" containerName="oc"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.210373    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.224282    4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.225084    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.245874    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"]
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311080    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311153    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.311171    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431482    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431615    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.431647    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.433166    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.475751    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.503948    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"collect-profiles-29539335-68xb5\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:00 crc kubenswrapper[4792]: I0301 10:15:00.550156    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.096662    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"]
Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.623580    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerStarted","Data":"889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1"}
Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.625035    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerStarted","Data":"57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb"}
Mar 01 10:15:01 crc kubenswrapper[4792]: I0301 10:15:01.642117    4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" podStartSLOduration=1.6420983310000001 podStartE2EDuration="1.642098331s" podCreationTimestamp="2026-03-01 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:15:01.636331638 +0000 UTC m=+4030.878210825" watchObservedRunningTime="2026-03-01 10:15:01.642098331 +0000 UTC m=+4030.883977528"
Mar 01 10:15:02 crc kubenswrapper[4792]: I0301 10:15:02.632748    4792 generic.go:334] "Generic (PLEG): container finished" podID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerID="889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1" exitCode=0
Mar 01 10:15:02 crc kubenswrapper[4792]: I0301 10:15:02.632819    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerDied","Data":"889e7f5e3098bfec5bc19d7c90225344ee63adb7281cb525b526611dc8b03fc1"}
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.133827    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.213991    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") "
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.214145    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") "
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.214436    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") pod \"d42ade11-691f-471f-ade0-0d9e12c70d1f\" (UID: \"d42ade11-691f-471f-ade0-0d9e12c70d1f\") "
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.215158    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.215826    4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d42ade11-691f-471f-ade0-0d9e12c70d1f-config-volume\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.224119    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.224628    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv" (OuterVolumeSpecName: "kube-api-access-qmcbv") pod "d42ade11-691f-471f-ade0-0d9e12c70d1f" (UID: "d42ade11-691f-471f-ade0-0d9e12c70d1f"). InnerVolumeSpecName "kube-api-access-qmcbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.319043    4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d42ade11-691f-471f-ade0-0d9e12c70d1f-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.319090    4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcbv\" (UniqueName: \"kubernetes.io/projected/d42ade11-691f-471f-ade0-0d9e12c70d1f-kube-api-access-qmcbv\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.524498    4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"]
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.550505    4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539290-wch24"]
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657595    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5" event={"ID":"d42ade11-691f-471f-ade0-0d9e12c70d1f","Type":"ContainerDied","Data":"57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb"}
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657676    4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b5c9ebab15dec960db41e797d308da20fb69805b2d99394a6db01cb3b22ceb"
Mar 01 10:15:04 crc kubenswrapper[4792]: I0301 10:15:04.657720    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539335-68xb5"
Mar 01 10:15:05 crc kubenswrapper[4792]: I0301 10:15:05.424037    4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29833925-b21b-44d4-954c-e3252e5e69c4" path="/var/lib/kubelet/pods/29833925-b21b-44d4-954c-e3252e5e69c4/volumes"
Mar 01 10:15:06 crc kubenswrapper[4792]: I0301 10:15:06.409835    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:15:06 crc kubenswrapper[4792]: E0301 10:15:06.410686    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:15:19 crc kubenswrapper[4792]: I0301 10:15:19.409643    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:15:19 crc kubenswrapper[4792]: E0301 10:15:19.410498    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:15:33 crc kubenswrapper[4792]: I0301 10:15:33.842863    4792 scope.go:117] "RemoveContainer" containerID="2ef9903d192bdc03acaf2fe74facecbf1f886cfe12a9446e1e544403dd9c2365"
Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.409651    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:15:34 crc kubenswrapper[4792]: E0301 10:15:34.410157    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.918196    4792 generic.go:334] "Generic (PLEG): container finished" podID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerID="0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47" exitCode=0
Mar 01 10:15:34 crc kubenswrapper[4792]: I0301 10:15:34.918249    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerDied","Data":"0b1c921f1338ea9b8f3dd9b08a6d658d40119ab101643c7964b03d38bfa73f47"}
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.414411    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603171    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603222    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603243    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603322    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603388    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603485    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603515    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603603    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.603655    4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\" (UID: \"ee1c75ce-61f7-4ce5-a757-b7405d7135bd\") "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.606033    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609196    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609505    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.609970    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data" (OuterVolumeSpecName: "config-data") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.610354    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj" (OuterVolumeSpecName: "kube-api-access-slzdj") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "kube-api-access-slzdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.644031    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.647371    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.658549    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.661045    4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ee1c75ce-61f7-4ce5-a757-b7405d7135bd" (UID: "ee1c75ce-61f7-4ce5-a757-b7405d7135bd"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706155    4792 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706704    4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706725    4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706750    4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706766    4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-config-data\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706779    4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slzdj\" (UniqueName: \"kubernetes.io/projected/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-kube-api-access-slzdj\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706792    4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706804    4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.706817    4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee1c75ce-61f7-4ce5-a757-b7405d7135bd-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.730083    4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.808100    4792 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934025    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee1c75ce-61f7-4ce5-a757-b7405d7135bd","Type":"ContainerDied","Data":"56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432"}
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934296    4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c23af9ad5afce4e187f9687e375e2437b0d33726c9ce5a2f51b334bc434432"
Mar 01 10:15:36 crc kubenswrapper[4792]: I0301 10:15:36.934072    4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.870256    4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 01 10:15:46 crc kubenswrapper[4792]: E0301 10:15:46.871278    4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871297    4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles"
Mar 01 10:15:46 crc kubenswrapper[4792]: E0301 10:15:46.871346    4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871356    4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871537    4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42ade11-691f-471f-ade0-0d9e12c70d1f" containerName="collect-profiles"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.871553    4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1c75ce-61f7-4ce5-a757-b7405d7135bd" containerName="tempest-tests-tempest-tests-runner"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.872300    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.874033    4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nl48r"
Mar 01 10:15:46 crc kubenswrapper[4792]: I0301 10:15:46.894829    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.007773    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.007931    4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7g4\" (UniqueName: \"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.109728    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.109872    4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7g4\" (UniqueName: \"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.110978    4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.135855    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7g4\" (UniqueName: \"kubernetes.io/projected/478d8531-4e8e-4775-999d-42af4afef106-kube-api-access-xl7g4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.139101    4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"478d8531-4e8e-4775-999d-42af4afef106\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.188020    4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 01 10:15:47 crc kubenswrapper[4792]: I0301 10:15:47.695714    4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 01 10:15:48 crc kubenswrapper[4792]: I0301 10:15:48.033838    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"478d8531-4e8e-4775-999d-42af4afef106","Type":"ContainerStarted","Data":"6a3ed20822578164d0bed56841e7cdf7da3fe3a1374b8fe4d4a5d98a5767e2c1"}
Mar 01 10:15:48 crc kubenswrapper[4792]: I0301 10:15:48.409057    4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de"
Mar 01 10:15:48 crc kubenswrapper[4792]: E0301 10:15:48.409573    4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765"
Mar 01 10:15:50 crc kubenswrapper[4792]: I0301 10:15:50.049383    4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"478d8531-4e8e-4775-999d-42af4afef106","Type":"ContainerStarted","Data":"3bf971832b4d249de6ae0b44a82cfecf495c4144dfad64713432eabd53146ea2"}
Mar 01 10:15:50 crc kubenswrapper[4792]: I0301 10:15:50.065923    4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.785206419 podStartE2EDuration="4.065887807s" podCreationTimestamp="2026-03-01 10:15:46 +0000 UTC" firstStartedPulling="2026-03-01 
10:15:47.680799497 +0000 UTC m=+4076.922678694" lastFinishedPulling="2026-03-01 10:15:48.961480885 +0000 UTC m=+4078.203360082" observedRunningTime="2026-03-01 10:15:50.063880647 +0000 UTC m=+4079.305759844" watchObservedRunningTime="2026-03-01 10:15:50.065887807 +0000 UTC m=+4079.307767004" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.143576 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.145420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.147941 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.148022 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.148404 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.152955 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.310604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.412814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkklg\" (UniqueName: 
\"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.438093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"auto-csr-approver-29539336-vnhz7\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.470056 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:00 crc kubenswrapper[4792]: I0301 10:16:00.939538 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:16:01 crc kubenswrapper[4792]: I0301 10:16:01.176632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerStarted","Data":"f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32"} Mar 01 10:16:02 crc kubenswrapper[4792]: I0301 10:16:02.185031 4792 generic.go:334] "Generic (PLEG): container finished" podID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerID="457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33" exitCode=0 Mar 01 10:16:02 crc kubenswrapper[4792]: I0301 10:16:02.185083 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerDied","Data":"457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33"} Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.411477 4792 scope.go:117] 
"RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:03 crc kubenswrapper[4792]: E0301 10:16:03.412307 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.683522 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.803460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") pod \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\" (UID: \"731500c7-53e0-431a-9ec7-7e56ef9c11ee\") " Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.808252 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg" (OuterVolumeSpecName: "kube-api-access-bkklg") pod "731500c7-53e0-431a-9ec7-7e56ef9c11ee" (UID: "731500c7-53e0-431a-9ec7-7e56ef9c11ee"). InnerVolumeSpecName "kube-api-access-bkklg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:16:03 crc kubenswrapper[4792]: I0301 10:16:03.907374 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkklg\" (UniqueName: \"kubernetes.io/projected/731500c7-53e0-431a-9ec7-7e56ef9c11ee-kube-api-access-bkklg\") on node \"crc\" DevicePath \"\"" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203149 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" event={"ID":"731500c7-53e0-431a-9ec7-7e56ef9c11ee","Type":"ContainerDied","Data":"f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32"} Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203200 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0a611fca5aa0d9b839a3b52b4e4a4b3d492bced131624b7eed8f41ebfde0b32" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.203201 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539336-vnhz7" Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.759004 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:16:04 crc kubenswrapper[4792]: I0301 10:16:04.766580 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539330-wqct7"] Mar 01 10:16:05 crc kubenswrapper[4792]: I0301 10:16:05.421575 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9025376-622f-4b7e-94d5-c5b136e139d8" path="/var/lib/kubelet/pods/a9025376-622f-4b7e-94d5-c5b136e139d8/volumes" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.491296 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:11 crc kubenswrapper[4792]: E0301 10:16:11.492140 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.492151 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.492325 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" containerName="oc" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.493713 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496394 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8szkd"/"default-dockercfg-sh7xn" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496502 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8szkd"/"openshift-service-ca.crt" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.496654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8szkd"/"kube-root-ca.crt" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.519168 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.661213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.661321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.762883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.763247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.763531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.781400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"must-gather-5ntfl\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:11 crc kubenswrapper[4792]: I0301 10:16:11.813423 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:16:12 crc kubenswrapper[4792]: I0301 10:16:12.333360 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:16:13 crc kubenswrapper[4792]: I0301 10:16:13.283292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"740afa23c87b88c924b6a67375351371f8d7a97fff51d47250af7695c11c9757"} Mar 01 10:16:17 crc kubenswrapper[4792]: I0301 10:16:17.412311 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:17 crc kubenswrapper[4792]: E0301 10:16:17.413168 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.607405 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.610547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.616300 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679388 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.679985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782318 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.782803 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.783090 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.810630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"redhat-operators-6z92d\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:19 crc kubenswrapper[4792]: I0301 10:16:19.947261 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.422109 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a"} Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.422469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerStarted","Data":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"} Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.469525 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8szkd/must-gather-5ntfl" podStartSLOduration=2.323804022 podStartE2EDuration="9.469507181s" podCreationTimestamp="2026-03-01 10:16:11 +0000 UTC" firstStartedPulling="2026-03-01 10:16:12.335162831 +0000 UTC m=+4101.577042028" lastFinishedPulling="2026-03-01 10:16:19.48086599 +0000 UTC m=+4108.722745187" observedRunningTime="2026-03-01 10:16:20.445403702 +0000 UTC m=+4109.687282899" watchObservedRunningTime="2026-03-01 10:16:20.469507181 +0000 UTC m=+4109.711386378" Mar 01 10:16:20 crc kubenswrapper[4792]: I0301 10:16:20.499038 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.453329 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" exitCode=0 Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.454488 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" 
event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e"} Mar 01 10:16:21 crc kubenswrapper[4792]: I0301 10:16:21.454514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"9653c39b1fc9d37ab3fa3dc401ab1732f3efcb27add0085798088c83918fe421"} Mar 01 10:16:22 crc kubenswrapper[4792]: I0301 10:16:22.467370 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} Mar 01 10:16:25 crc kubenswrapper[4792]: E0301 10:16:25.358837 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:37230->38.102.83.89:34111: write tcp 38.102.83.89:37230->38.102.83.89:34111: write: broken pipe Mar 01 10:16:25 crc kubenswrapper[4792]: E0301 10:16:25.512669 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.89:37276->38.102.83.89:34111: write tcp 38.102.83.89:37276->38.102.83.89:34111: write: connection reset by peer Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.188635 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.190469 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.246723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.246850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.349757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc 
kubenswrapper[4792]: I0301 10:16:27.371176 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"crc-debug-zgt6k\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.506856 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.512624 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" exitCode=0 Mar 01 10:16:27 crc kubenswrapper[4792]: I0301 10:16:27.512696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.410009 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:28 crc kubenswrapper[4792]: E0301 10:16:28.410659 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.531120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" 
event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerStarted","Data":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.533275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerStarted","Data":"434542fb734e119099c99109e2963f44067d819ceef6faa0696fa255b4e3073f"} Mar 01 10:16:28 crc kubenswrapper[4792]: I0301 10:16:28.557439 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6z92d" podStartSLOduration=3.06851316 podStartE2EDuration="9.557419906s" podCreationTimestamp="2026-03-01 10:16:19 +0000 UTC" firstStartedPulling="2026-03-01 10:16:21.459101366 +0000 UTC m=+4110.700980563" lastFinishedPulling="2026-03-01 10:16:27.948008112 +0000 UTC m=+4117.189887309" observedRunningTime="2026-03-01 10:16:28.54510314 +0000 UTC m=+4117.786982337" watchObservedRunningTime="2026-03-01 10:16:28.557419906 +0000 UTC m=+4117.799299103" Mar 01 10:16:29 crc kubenswrapper[4792]: I0301 10:16:29.947773 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:29 crc kubenswrapper[4792]: I0301 10:16:29.948093 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:16:31 crc kubenswrapper[4792]: I0301 10:16:31.000702 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:31 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:31 crc kubenswrapper[4792]: > Mar 01 10:16:33 crc kubenswrapper[4792]: I0301 10:16:33.936096 4792 scope.go:117] "RemoveContainer" 
containerID="efa17f624abb4097f736a6d5cc9c3131b915c07d36d54a5857fd159d31a5a613" Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.409548 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:40 crc kubenswrapper[4792]: E0301 10:16:40.410354 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.653279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerStarted","Data":"12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9"} Mar 01 10:16:40 crc kubenswrapper[4792]: I0301 10:16:40.671610 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" podStartSLOduration=1.830996534 podStartE2EDuration="13.671593608s" podCreationTimestamp="2026-03-01 10:16:27 +0000 UTC" firstStartedPulling="2026-03-01 10:16:27.572539669 +0000 UTC m=+4116.814418866" lastFinishedPulling="2026-03-01 10:16:39.413136743 +0000 UTC m=+4128.655015940" observedRunningTime="2026-03-01 10:16:40.669212188 +0000 UTC m=+4129.911091385" watchObservedRunningTime="2026-03-01 10:16:40.671593608 +0000 UTC m=+4129.913472805" Mar 01 10:16:41 crc kubenswrapper[4792]: I0301 10:16:41.122163 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:41 
crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:41 crc kubenswrapper[4792]: > Mar 01 10:16:50 crc kubenswrapper[4792]: I0301 10:16:50.997194 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:16:50 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:16:50 crc kubenswrapper[4792]: > Mar 01 10:16:53 crc kubenswrapper[4792]: I0301 10:16:53.408801 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:16:53 crc kubenswrapper[4792]: E0301 10:16:53.409762 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:01 crc kubenswrapper[4792]: I0301 10:17:01.011196 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" probeResult="failure" output=< Mar 01 10:17:01 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:17:01 crc kubenswrapper[4792]: > Mar 01 10:17:04 crc kubenswrapper[4792]: I0301 10:17:04.409194 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:04 crc kubenswrapper[4792]: E0301 10:17:04.409953 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:09 crc kubenswrapper[4792]: I0301 10:17:09.996141 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:10 crc kubenswrapper[4792]: I0301 10:17:10.051087 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:10 crc kubenswrapper[4792]: I0301 10:17:10.235070 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:11 crc kubenswrapper[4792]: I0301 10:17:11.402220 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6z92d" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" containerID="cri-o://a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" gracePeriod=2 Mar 01 10:17:11 crc kubenswrapper[4792]: I0301 10:17:11.988467 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149565 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.149737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") pod \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\" (UID: \"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f\") " Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.150364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities" (OuterVolumeSpecName: "utilities") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.162548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj" (OuterVolumeSpecName: "kube-api-access-drslj") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "kube-api-access-drslj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.252405 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drslj\" (UniqueName: \"kubernetes.io/projected/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-kube-api-access-drslj\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.252437 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.279892 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" (UID: "7fb9c2d9-68e1-4fca-9749-7abfe0778f3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.354141 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411263 4792 generic.go:334] "Generic (PLEG): container finished" podID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" exitCode=0 Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6z92d" event={"ID":"7fb9c2d9-68e1-4fca-9749-7abfe0778f3f","Type":"ContainerDied","Data":"9653c39b1fc9d37ab3fa3dc401ab1732f3efcb27add0085798088c83918fe421"} Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411344 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6z92d" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.411361 4792 scope.go:117] "RemoveContainer" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.431111 4792 scope.go:117] "RemoveContainer" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.474988 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.485460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6z92d"] Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.488165 4792 scope.go:117] "RemoveContainer" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.510868 4792 scope.go:117] "RemoveContainer" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 10:17:12.511528 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": container with ID starting with a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09 not found: ID does not exist" containerID="a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.511558 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09"} err="failed to get container status \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": rpc error: code = NotFound desc = could not find container \"a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09\": container with ID starting with a190ba791fc79c3bb1e89f0f4c912141789982112b6c4f33837ef401b1d2da09 not found: ID does not exist" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.511580 4792 scope.go:117] "RemoveContainer" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 10:17:12.513312 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": container with ID starting with e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7 not found: ID does not exist" containerID="e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.513355 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7"} err="failed to get container status \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": rpc error: code = NotFound desc = could not find container \"e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7\": container with ID starting with e3667961a693dba37bb04e11233051f1743a9f4f35ff5d1573d0a8ac7aed1fd7 not found: ID does not exist" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.513383 4792 scope.go:117] "RemoveContainer" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: E0301 
10:17:12.514206 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": container with ID starting with 12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e not found: ID does not exist" containerID="12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e" Mar 01 10:17:12 crc kubenswrapper[4792]: I0301 10:17:12.514234 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e"} err="failed to get container status \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": rpc error: code = NotFound desc = could not find container \"12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e\": container with ID starting with 12e62220e8f801f41c466262887e7df524da2d6dadb151112dbd3c99a23c9b8e not found: ID does not exist" Mar 01 10:17:13 crc kubenswrapper[4792]: I0301 10:17:13.422418 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" path="/var/lib/kubelet/pods/7fb9c2d9-68e1-4fca-9749-7abfe0778f3f/volumes" Mar 01 10:17:15 crc kubenswrapper[4792]: I0301 10:17:15.409831 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:15 crc kubenswrapper[4792]: E0301 10:17:15.410734 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:20 crc kubenswrapper[4792]: I0301 10:17:20.475593 
4792 generic.go:334] "Generic (PLEG): container finished" podID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerID="12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9" exitCode=0 Mar 01 10:17:20 crc kubenswrapper[4792]: I0301 10:17:20.475813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" event={"ID":"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06","Type":"ContainerDied","Data":"12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9"} Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.663228 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744216 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") pod \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744269 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") pod \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\" (UID: \"21e1cd52-26d9-4c5d-b87b-b124e2b7cb06\") " Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744418 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host" (OuterVolumeSpecName: "host") pod "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" (UID: "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.744719 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.752165 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf" (OuterVolumeSpecName: "kube-api-access-9nbbf") pod "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" (UID: "21e1cd52-26d9-4c5d-b87b-b124e2b7cb06"). InnerVolumeSpecName "kube-api-access-9nbbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.782194 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.798355 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-zgt6k"] Mar 01 10:17:21 crc kubenswrapper[4792]: I0301 10:17:21.847003 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nbbf\" (UniqueName: \"kubernetes.io/projected/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06-kube-api-access-9nbbf\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:22 crc kubenswrapper[4792]: I0301 10:17:22.510306 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434542fb734e119099c99109e2963f44067d819ceef6faa0696fa255b4e3073f" Mar 01 10:17:22 crc kubenswrapper[4792]: I0301 10:17:22.510367 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-zgt6k" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122462 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.122953 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122969 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.122979 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-content" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.122987 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-content" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.123010 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123019 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: E0301 10:17:23.123043 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-utilities" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123051 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="extract-utilities" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123303 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" 
containerName="container-00" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.123317 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb9c2d9-68e1-4fca-9749-7abfe0778f3f" containerName="registry-server" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.124079 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.171416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.171538 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.273151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.273240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc 
kubenswrapper[4792]: I0301 10:17:23.273436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.288618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"crc-debug-ph9f5\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.418367 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e1cd52-26d9-4c5d-b87b-b124e2b7cb06" path="/var/lib/kubelet/pods/21e1cd52-26d9-4c5d-b87b-b124e2b7cb06/volumes" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.441058 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:23 crc kubenswrapper[4792]: I0301 10:17:23.546891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" event={"ID":"7f8b498c-79c9-4d69-85fe-5662bac5a08d","Type":"ContainerStarted","Data":"8d94543550dd8ed17603b227cf614a7d05921b5c966f5daaf670da82374caa69"} Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.559392 4792 generic.go:334] "Generic (PLEG): container finished" podID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerID="df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054" exitCode=0 Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.559513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" event={"ID":"7f8b498c-79c9-4d69-85fe-5662bac5a08d","Type":"ContainerDied","Data":"df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054"} Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.973591 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:24 crc kubenswrapper[4792]: I0301 10:17:24.981579 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-ph9f5"] Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.671531 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.721896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") pod \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.722016 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") pod \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\" (UID: \"7f8b498c-79c9-4d69-85fe-5662bac5a08d\") " Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.722558 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host" (OuterVolumeSpecName: "host") pod "7f8b498c-79c9-4d69-85fe-5662bac5a08d" (UID: "7f8b498c-79c9-4d69-85fe-5662bac5a08d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.728110 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g" (OuterVolumeSpecName: "kube-api-access-wbr4g") pod "7f8b498c-79c9-4d69-85fe-5662bac5a08d" (UID: "7f8b498c-79c9-4d69-85fe-5662bac5a08d"). InnerVolumeSpecName "kube-api-access-wbr4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.824953 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbr4g\" (UniqueName: \"kubernetes.io/projected/7f8b498c-79c9-4d69-85fe-5662bac5a08d-kube-api-access-wbr4g\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:25 crc kubenswrapper[4792]: I0301 10:17:25.824999 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f8b498c-79c9-4d69-85fe-5662bac5a08d-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.240131 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:26 crc kubenswrapper[4792]: E0301 10:17:26.240932 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.240949 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.241131 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" containerName="container-00" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.241724 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.334444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.334511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.435994 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc 
kubenswrapper[4792]: I0301 10:17:26.680129 4792 scope.go:117] "RemoveContainer" containerID="df8ecbd7090c652c476da0720390ff4d5644e5341ba47e2dc96da76468154054" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.680243 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-ph9f5" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.761588 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"crc-debug-c6rzj\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:26 crc kubenswrapper[4792]: I0301 10:17:26.863471 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:27 crc kubenswrapper[4792]: W0301 10:17:27.032292 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3548cb06_25bd_4b92_8633_a56e0996925f.slice/crio-c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d WatchSource:0}: Error finding container c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d: Status 404 returned error can't find the container with id c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.409450 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:27 crc kubenswrapper[4792]: E0301 10:17:27.409997 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.418652 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8b498c-79c9-4d69-85fe-5662bac5a08d" path="/var/lib/kubelet/pods/7f8b498c-79c9-4d69-85fe-5662bac5a08d/volumes" Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691121 4792 generic.go:334] "Generic (PLEG): container finished" podID="3548cb06-25bd-4b92-8633-a56e0996925f" containerID="1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936" exitCode=0 Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691199 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" event={"ID":"3548cb06-25bd-4b92-8633-a56e0996925f","Type":"ContainerDied","Data":"1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936"} Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.691241 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" event={"ID":"3548cb06-25bd-4b92-8633-a56e0996925f","Type":"ContainerStarted","Data":"c4ec4652ea067166a99eb4f583932618d94a381e324521e0c66f56b8a107e32d"} Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.731556 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:27 crc kubenswrapper[4792]: I0301 10:17:27.740227 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/crc-debug-c6rzj"] Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.879634 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912420 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") pod \"3548cb06-25bd-4b92-8633-a56e0996925f\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") pod \"3548cb06-25bd-4b92-8633-a56e0996925f\" (UID: \"3548cb06-25bd-4b92-8633-a56e0996925f\") " Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912535 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host" (OuterVolumeSpecName: "host") pod "3548cb06-25bd-4b92-8633-a56e0996925f" (UID: "3548cb06-25bd-4b92-8633-a56e0996925f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.912942 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3548cb06-25bd-4b92-8633-a56e0996925f-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:28 crc kubenswrapper[4792]: I0301 10:17:28.918069 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn" (OuterVolumeSpecName: "kube-api-access-sxkxn") pod "3548cb06-25bd-4b92-8633-a56e0996925f" (UID: "3548cb06-25bd-4b92-8633-a56e0996925f"). InnerVolumeSpecName "kube-api-access-sxkxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.014777 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkxn\" (UniqueName: \"kubernetes.io/projected/3548cb06-25bd-4b92-8633-a56e0996925f-kube-api-access-sxkxn\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.421209 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" path="/var/lib/kubelet/pods/3548cb06-25bd-4b92-8633-a56e0996925f/volumes" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.709117 4792 scope.go:117] "RemoveContainer" containerID="1fb264ea34cd0df53641cbc5584d4107d16d25be8d770a5ddb0d84a29a984936" Mar 01 10:17:29 crc kubenswrapper[4792]: I0301 10:17:29.709494 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/crc-debug-c6rzj" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.768961 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:39 crc kubenswrapper[4792]: E0301 10:17:39.769869 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.769881 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.770097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3548cb06-25bd-4b92-8633-a56e0996925f" containerName="container-00" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.771624 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.793301 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834079 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.834208 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935456 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935531 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.935644 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.936147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.936356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:39 crc kubenswrapper[4792]: I0301 10:17:39.957038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"community-operators-8qspr\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.090244 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.712756 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:40 crc kubenswrapper[4792]: I0301 10:17:40.798529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"6c752bb11d7e7fc2f68f413de4a7285c4f5635a9012e66eda517f2d86c1445a2"} Mar 01 10:17:41 crc kubenswrapper[4792]: I0301 10:17:41.809054 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" exitCode=0 Mar 01 10:17:41 crc kubenswrapper[4792]: I0301 10:17:41.809179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17"} Mar 01 10:17:42 crc kubenswrapper[4792]: I0301 10:17:42.408757 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:42 crc kubenswrapper[4792]: E0301 10:17:42.409259 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:42 crc kubenswrapper[4792]: I0301 10:17:42.819373 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" 
event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} Mar 01 10:17:44 crc kubenswrapper[4792]: I0301 10:17:44.837561 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" exitCode=0 Mar 01 10:17:44 crc kubenswrapper[4792]: I0301 10:17:44.837658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} Mar 01 10:17:45 crc kubenswrapper[4792]: I0301 10:17:45.853823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerStarted","Data":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} Mar 01 10:17:45 crc kubenswrapper[4792]: I0301 10:17:45.895720 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qspr" podStartSLOduration=3.497556296 podStartE2EDuration="6.895703404s" podCreationTimestamp="2026-03-01 10:17:39 +0000 UTC" firstStartedPulling="2026-03-01 10:17:41.81168804 +0000 UTC m=+4191.053567237" lastFinishedPulling="2026-03-01 10:17:45.209835148 +0000 UTC m=+4194.451714345" observedRunningTime="2026-03-01 10:17:45.884296201 +0000 UTC m=+4195.126175398" watchObservedRunningTime="2026-03-01 10:17:45.895703404 +0000 UTC m=+4195.137582601" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.090687 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.091279 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.201041 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:50 crc kubenswrapper[4792]: I0301 10:17:50.959624 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:51 crc kubenswrapper[4792]: I0301 10:17:51.031051 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:52 crc kubenswrapper[4792]: I0301 10:17:52.916566 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qspr" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" containerID="cri-o://8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" gracePeriod=2 Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.895467 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.922522 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") pod \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\" (UID: \"4edec827-16e6-4138-a7f0-0b84d0c3dfa6\") " Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.924125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities" (OuterVolumeSpecName: "utilities") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946705 4792 generic.go:334] "Generic (PLEG): container finished" podID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" exitCode=0 Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946785 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qspr" event={"ID":"4edec827-16e6-4138-a7f0-0b84d0c3dfa6","Type":"ContainerDied","Data":"6c752bb11d7e7fc2f68f413de4a7285c4f5635a9012e66eda517f2d86c1445a2"} Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.946809 4792 scope.go:117] "RemoveContainer" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.947431 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qspr" Mar 01 10:17:53 crc kubenswrapper[4792]: I0301 10:17:53.957917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw" (OuterVolumeSpecName: "kube-api-access-qwqhw") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "kube-api-access-qwqhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.007858 4792 scope.go:117] "RemoveContainer" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.026753 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.026785 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwqhw\" (UniqueName: \"kubernetes.io/projected/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-kube-api-access-qwqhw\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.041557 4792 scope.go:117] "RemoveContainer" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.094896 4792 scope.go:117] "RemoveContainer" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.095711 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": container with ID starting with 8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa not found: ID does not exist" containerID="8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.095753 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa"} err="failed to get container status \"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": rpc error: code = NotFound desc = could not find container 
\"8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa\": container with ID starting with 8333c23bb09de5c284f9191008ad8bb3eebc5ca29e546e6a1e85f7f7a48bcdaa not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.095778 4792 scope.go:117] "RemoveContainer" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.096276 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": container with ID starting with 608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60 not found: ID does not exist" containerID="608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096324 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60"} err="failed to get container status \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": rpc error: code = NotFound desc = could not find container \"608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60\": container with ID starting with 608a7ad1aff7c1f3a38df93cfda6b576e32ba8d1b48e9a116624e3763fb48e60 not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096367 4792 scope.go:117] "RemoveContainer" containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: E0301 10:17:54.096770 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": container with ID starting with 43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17 not found: ID does not exist" 
containerID="43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.096811 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17"} err="failed to get container status \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": rpc error: code = NotFound desc = could not find container \"43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17\": container with ID starting with 43c8c65f4329bbe44b7faf9b7a5c4619edafb02deae1f7f400f2f685457f7e17 not found: ID does not exist" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.239914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4edec827-16e6-4138-a7f0-0b84d0c3dfa6" (UID: "4edec827-16e6-4138-a7f0-0b84d0c3dfa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.284478 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.292287 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qspr"] Mar 01 10:17:54 crc kubenswrapper[4792]: I0301 10:17:54.337354 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4edec827-16e6-4138-a7f0-0b84d0c3dfa6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:17:55 crc kubenswrapper[4792]: I0301 10:17:55.408897 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:17:55 crc kubenswrapper[4792]: E0301 10:17:55.409533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:17:55 crc kubenswrapper[4792]: I0301 10:17:55.418794 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" path="/var/lib/kubelet/pods/4edec827-16e6-4138-a7f0-0b84d0c3dfa6/volumes" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.148976 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149751 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc 
kubenswrapper[4792]: I0301 10:18:00.149764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149793 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-content" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.149799 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-content" Mar 01 10:18:00 crc kubenswrapper[4792]: E0301 10:18:00.149814 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-utilities" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.149821 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="extract-utilities" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.150660 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edec827-16e6-4138-a7f0-0b84d0c3dfa6" containerName="registry-server" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.212203 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.215028 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.216383 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.216997 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.219670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.255034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.356998 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.377874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"auto-csr-approver-29539338-kcgfx\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " 
pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:00 crc kubenswrapper[4792]: I0301 10:18:00.559169 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:01 crc kubenswrapper[4792]: I0301 10:18:01.018677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:18:02 crc kubenswrapper[4792]: I0301 10:18:02.028793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerStarted","Data":"122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f"} Mar 01 10:18:03 crc kubenswrapper[4792]: I0301 10:18:03.038889 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerStarted","Data":"dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf"} Mar 01 10:18:03 crc kubenswrapper[4792]: I0301 10:18:03.056535 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" podStartSLOduration=2.158880776 podStartE2EDuration="3.056513513s" podCreationTimestamp="2026-03-01 10:18:00 +0000 UTC" firstStartedPulling="2026-03-01 10:18:01.166538424 +0000 UTC m=+4210.408417621" lastFinishedPulling="2026-03-01 10:18:02.064171161 +0000 UTC m=+4211.306050358" observedRunningTime="2026-03-01 10:18:03.053052547 +0000 UTC m=+4212.294931764" watchObservedRunningTime="2026-03-01 10:18:03.056513513 +0000 UTC m=+4212.298392720" Mar 01 10:18:04 crc kubenswrapper[4792]: I0301 10:18:04.048744 4792 generic.go:334] "Generic (PLEG): container finished" podID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerID="dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf" exitCode=0 Mar 01 10:18:04 crc 
kubenswrapper[4792]: I0301 10:18:04.048804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerDied","Data":"dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf"} Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.434307 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.581423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") pod \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\" (UID: \"a36fce8b-7564-4d74-b3ad-9bfe9979cb67\") " Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.594481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8" (OuterVolumeSpecName: "kube-api-access-gwkt8") pod "a36fce8b-7564-4d74-b3ad-9bfe9979cb67" (UID: "a36fce8b-7564-4d74-b3ad-9bfe9979cb67"). InnerVolumeSpecName "kube-api-access-gwkt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:18:05 crc kubenswrapper[4792]: I0301 10:18:05.684695 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwkt8\" (UniqueName: \"kubernetes.io/projected/a36fce8b-7564-4d74-b3ad-9bfe9979cb67-kube-api-access-gwkt8\") on node \"crc\" DevicePath \"\"" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066001 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" event={"ID":"a36fce8b-7564-4d74-b3ad-9bfe9979cb67","Type":"ContainerDied","Data":"122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f"} Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066178 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="122486aa561c7db6b94b6a3f7bfb44fc59bc39758ef217f2d542471c2f785f9f" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.066053 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539338-kcgfx" Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.122329 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:18:06 crc kubenswrapper[4792]: I0301 10:18:06.131448 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539332-f6brk"] Mar 01 10:18:07 crc kubenswrapper[4792]: I0301 10:18:07.421042 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9c6b5a-8834-45a7-bf9d-000bcfd068f3" path="/var/lib/kubelet/pods/5f9c6b5a-8834-45a7-bf9d-000bcfd068f3/volumes" Mar 01 10:18:08 crc kubenswrapper[4792]: I0301 10:18:08.409064 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:08 crc kubenswrapper[4792]: E0301 10:18:08.409629 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:19 crc kubenswrapper[4792]: I0301 10:18:19.409035 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:19 crc kubenswrapper[4792]: E0301 10:18:19.409792 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.234238 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.492845 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api-log/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.535789 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener/0.log" Mar 01 10:18:26 crc kubenswrapper[4792]: I0301 10:18:26.581592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener-log/0.log" Mar 01 
10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.261870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker-log/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.267907 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.489792 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb_1201ca91-41eb-45d0-991d-71883b4014ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.529741 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-central-agent/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.589373 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-notification-agent/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.707850 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/proxy-httpd/0.log" Mar 01 10:18:27 crc kubenswrapper[4792]: I0301 10:18:27.837804 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/sg-core/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.089035 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj_f3a428e9-b35d-4f80-bb40-c158095d5bfa/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.527023 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m_2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.644718 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api-log/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.663304 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api/0.log" Mar 01 10:18:28 crc kubenswrapper[4792]: I0301 10:18:28.878961 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.110818 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/cinder-backup/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.187064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/cinder-scheduler/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.279053 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.384399 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/cinder-volume/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.462559 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/probe/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.709115 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5_cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.749609 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dtgks_f25228f4-912f-408c-a1d6-9279c350b767/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:29 crc kubenswrapper[4792]: I0301 10:18:29.907223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.125056 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.247410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-httpd/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.276184 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/dnsmasq-dns/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.362984 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.554387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-httpd/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.616969 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.835590 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.920163 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon-log/0.log" Mar 01 10:18:30 crc kubenswrapper[4792]: I0301 10:18:30.986508 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-xqprh_d11c64e6-0562-41d9-a213-f1c5749b4c83/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.162002 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4rw28_822af429-9091-43e5-a16d-7a287f2c5bb2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.346062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29539321-sclgm_7ec04609-b280-4df0-a0c5-2e4c7208c1c6/keystone-cron/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.426784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749f685d77-ggsln_b60e7776-3e2a-4e08-900d-cd39a29a78bc/keystone-api/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.498133 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6f21d62f-3539-4d5d-aeaa-cc816a51d412/kube-state-metrics/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.772545 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt_c7230f65-7e9a-4455-8d25-c49393bfbafe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.818265 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api-log/0.log" Mar 01 10:18:31 crc kubenswrapper[4792]: I0301 10:18:31.865614 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.060716 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/probe/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.149037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/manila-scheduler/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.166699 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/manila-share/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.257082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/probe/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.409617 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:32 crc kubenswrapper[4792]: E0301 10:18:32.409813 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.598858 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-api/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.688700 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-httpd/0.log" Mar 01 10:18:32 crc kubenswrapper[4792]: I0301 10:18:32.832112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt_f737af00-5e6f-4a95-bf94-738b72990ebd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.383038 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f95aafcd-79b6-4ece-b3e1-ee9ea32a2754/nova-cell0-conductor-conductor/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.486292 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-log/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.700445 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-api/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.747712 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ef6cc4e-2fd6-403b-a163-638395c30672/nova-cell1-conductor-conductor/0.log" Mar 01 10:18:33 crc kubenswrapper[4792]: I0301 10:18:33.812200 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_63afaac7-c934-4410-b2b5-ab04ad085489/nova-cell1-novncproxy-novncproxy/0.log" Mar 01 10:18:34 crc 
kubenswrapper[4792]: I0301 10:18:34.076592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7_d7776778-c586-4ab6-8fdf-bfed4168992d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.194798 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-log/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.834974 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3a38c1a1-88bc-4bce-aea4-13e676aab111/nova-scheduler-scheduler/0.log" Mar 01 10:18:34 crc kubenswrapper[4792]: I0301 10:18:34.997360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.215391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/galera/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.247771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.479401 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.706755 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.711389 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/galera/0.log" Mar 01 10:18:35 crc kubenswrapper[4792]: I0301 10:18:35.863359 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-metadata/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.322749 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fecafda6-dcf9-46ea-8678-8da499154ad7/openstackclient/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.427079 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7wc55_9493aff0-58e3-44ca-ba01-69f3b284d732/openstack-network-exporter/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.662426 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mpvqc_d50ee3b1-4f97-4644-802d-04c85d9c3abc/ovn-controller/0.log" Mar 01 10:18:36 crc kubenswrapper[4792]: I0301 10:18:36.846252 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.038981 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.044380 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovs-vswitchd/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.116227 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.381147 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/openstack-network-exporter/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.392629 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc5rl_e4b8a64b-6bea-426c-b1f5-2372342d4211/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.874766 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/ovn-northd/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.896867 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/ovsdbserver-nb/0.log" Mar 01 10:18:37 crc kubenswrapper[4792]: I0301 10:18:37.976836 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/openstack-network-exporter/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.183381 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/openstack-network-exporter/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.208006 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/ovsdbserver-sb/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.482890 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-api/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.665177 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.701155 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-log/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.831358 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:18:38 crc kubenswrapper[4792]: I0301 10:18:38.958782 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/rabbitmq/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.054895 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.212971 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.415013 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/rabbitmq/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.443658 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d_34275228-a1ab-4955-9d16-d184643a86d1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.497132 4792 scope.go:117] "RemoveContainer" containerID="92df935b0380167f616deb42823e3a7a1744b9e85454d7327ad662b50fab1963" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 10:18:39.781250 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64_6c517000-6918-4f58-871b-7c4d26197ccf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:39 crc kubenswrapper[4792]: I0301 
10:18:39.880896 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gxxr7_ff733b23-0a97-4623-9eeb-339aa02fc3b0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.118344 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8k5rj_ac58ff00-ba74-492a-97f1-e72c56686f1d/ssh-known-hosts-edpm-deployment/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.201232 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee1c75ce-61f7-4ce5-a757-b7405d7135bd/tempest-tests-tempest-tests-runner/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.514626 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_478d8531-4e8e-4775-999d-42af4afef106/test-operator-logs-container/0.log" Mar 01 10:18:40 crc kubenswrapper[4792]: I0301 10:18:40.660033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-phn2l_59b987d8-9463-48cb-9651-1e5cb16aa764/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:18:47 crc kubenswrapper[4792]: I0301 10:18:47.412945 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:18:48 crc kubenswrapper[4792]: I0301 10:18:48.414316 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"} Mar 01 10:18:52 crc kubenswrapper[4792]: I0301 10:18:52.827559 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_84d455ad-7bbb-4771-a8ed-9aa1984e1d40/memcached/0.log" Mar 01 
10:19:11 crc kubenswrapper[4792]: I0301 10:19:11.774099 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:11 crc kubenswrapper[4792]: I0301 10:19:11.978148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.034134 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.063025 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.234343 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.284273 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.306449 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/extract/0.log" Mar 01 10:19:12 crc kubenswrapper[4792]: I0301 10:19:12.790687 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-72srw_bf1f37ea-a566-4dfd-b45b-02f284f19ce3/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.425980 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9wzbh_02dd5cc0-c44b-4ede-972b-9d26c9c54100/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.649749 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-7v65r_5044cf86-f557-41d4-b6c0-a41a668ac999/manager/0.log" Mar 01 10:19:13 crc kubenswrapper[4792]: I0301 10:19:13.931099 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-55qzx_cd83ed19-023d-43c2-92db-d290499db3d4/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.552388 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-jvw5j_2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.654998 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jlnsb_8741a141-0194-4eb2-956e-c41f4ffe1338/manager/0.log" Mar 01 10:19:14 crc kubenswrapper[4792]: I0301 10:19:14.735588 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-dsqtf_ea6739c2-185a-43e7-8fcf-0b2ae31957a0/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.100673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-t5fsn_376afe52-646d-44b7-b32e-ce6cd6dc21a6/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.144592 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wjf62_234d2ae5-7589-44cc-83f4-b0ee8a91940a/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.506870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-hlzm6_1793465e-1273-4250-a238-c99798788618/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.676140 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-qjqd2_dfb10d33-c4f1-4287-be83-dff835c733ba/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.924476 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-54rpl_ecc17c18-7695-4d22-9a95-bcac51800d60/manager/0.log" Mar 01 10:19:15 crc kubenswrapper[4792]: I0301 10:19:15.970363 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-knk7m_8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9/manager/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.160805 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776948grv_9244686e-175e-45f9-9eb7-23621cd1f3cd/manager/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.512480 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-vtchh_c967e6f5-6388-4ae5-9ccf-379b6305e1b0/operator/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.687033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5kfk4_dc22117a-72a7-4838-bb1c-111e91514b98/registry-server/0.log" Mar 01 10:19:16 crc kubenswrapper[4792]: I0301 10:19:16.925931 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-zkx7c_3d38195c-e4ff-49cf-9592-e9f52d73f2df/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.006978 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jdn6k_808b8753-0a20-419b-8b04-dcbccaa2d77e/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.264892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5l9m_1ecd6b07-eda9-41d6-90af-6471699ff808/operator/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.468296 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-mqndr_e0cef8e2-a392-4612-97c6-17c611b2a44e/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.738403 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-bcnns_2970c60c-7b03-4667-99e4-08c094cdbfc2/manager/0.log" Mar 01 10:19:17 crc kubenswrapper[4792]: I0301 10:19:17.764101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-jpxwz_4fe8270e-a46d-40bc-8d24-a4585b196f5e/manager/0.log" Mar 01 10:19:18 crc kubenswrapper[4792]: I0301 10:19:18.117660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-64lkf_e45ebab9-87d5-4b2f-b3d1-f1832864584d/manager/0.log" Mar 01 10:19:18 crc kubenswrapper[4792]: I0301 10:19:18.329718 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-5ndlx_d1d3783f-78e9-461a-916a-5a46e3083e70/manager/0.log" Mar 01 10:19:22 crc kubenswrapper[4792]: I0301 10:19:22.143987 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-ggspg_b9e3fd6b-e3e2-4380-b8d7-900891df562a/manager/0.log" Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.543032 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9smfd_e0b63d94-59de-45da-8058-89714bea7a90/control-plane-machine-set-operator/0.log" Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.725515 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/kube-rbac-proxy/0.log" Mar 01 10:19:41 crc kubenswrapper[4792]: I0301 10:19:41.806547 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/machine-api-operator/0.log" Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.645398 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tm5s6_2071887a-31a9-428d-92d0-bf8a361011ca/cert-manager-cainjector/0.log" Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.674241 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4qgsm_bf71ada0-c7b2-4255-bb2c-31ec3309a29d/cert-manager-controller/0.log" Mar 01 10:19:57 crc kubenswrapper[4792]: I0301 10:19:57.930124 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rckpb_a03eedd4-ecde-4905-95a7-c43b45ef9da9/cert-manager-webhook/0.log" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.147171 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:20:00 crc kubenswrapper[4792]: E0301 10:20:00.148087 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148100 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148293 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" containerName="oc" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.148930 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153454 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.153449 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.159824 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.325787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.428449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: 
\"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.451724 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"auto-csr-approver-29539340-rjnp9\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:00 crc kubenswrapper[4792]: I0301 10:20:00.472629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:01 crc kubenswrapper[4792]: I0301 10:20:01.066040 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:20:01 crc kubenswrapper[4792]: I0301 10:20:01.075157 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:20:02 crc kubenswrapper[4792]: I0301 10:20:02.042355 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerStarted","Data":"993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6"} Mar 01 10:20:03 crc kubenswrapper[4792]: I0301 10:20:03.050755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerStarted","Data":"fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7"} Mar 01 10:20:03 crc kubenswrapper[4792]: I0301 10:20:03.066639 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29539340-rjnp9" podStartSLOduration=1.995481039 podStartE2EDuration="3.066623723s" podCreationTimestamp="2026-03-01 10:20:00 +0000 UTC" firstStartedPulling="2026-03-01 10:20:01.065730643 +0000 UTC m=+4330.307609850" lastFinishedPulling="2026-03-01 10:20:02.136873337 +0000 UTC m=+4331.378752534" observedRunningTime="2026-03-01 10:20:03.065085625 +0000 UTC m=+4332.306964832" watchObservedRunningTime="2026-03-01 10:20:03.066623723 +0000 UTC m=+4332.308502920" Mar 01 10:20:04 crc kubenswrapper[4792]: I0301 10:20:04.060163 4792 generic.go:334] "Generic (PLEG): container finished" podID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerID="fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7" exitCode=0 Mar 01 10:20:04 crc kubenswrapper[4792]: I0301 10:20:04.060216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerDied","Data":"fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7"} Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.402669 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.537438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") pod \"5d79ac35-053d-480b-a8ef-3b03122b0152\" (UID: \"5d79ac35-053d-480b-a8ef-3b03122b0152\") " Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.543865 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw" (OuterVolumeSpecName: "kube-api-access-8wjxw") pod "5d79ac35-053d-480b-a8ef-3b03122b0152" (UID: "5d79ac35-053d-480b-a8ef-3b03122b0152"). InnerVolumeSpecName "kube-api-access-8wjxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:20:05 crc kubenswrapper[4792]: I0301 10:20:05.639486 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wjxw\" (UniqueName: \"kubernetes.io/projected/5d79ac35-053d-480b-a8ef-3b03122b0152-kube-api-access-8wjxw\") on node \"crc\" DevicePath \"\"" Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" event={"ID":"5d79ac35-053d-480b-a8ef-3b03122b0152","Type":"ContainerDied","Data":"993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6"} Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079926 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="993ecebc9f66b56082a2c6aad384afbd8b43e4f69e9684b2bc92daa3f1248dd6" Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.079931 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539340-rjnp9" Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.141964 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:20:06 crc kubenswrapper[4792]: I0301 10:20:06.150460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539334-c4cd2"] Mar 01 10:20:07 crc kubenswrapper[4792]: I0301 10:20:07.422197 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807c2da3-ef0e-4e89-9457-37401354a8e9" path="/var/lib/kubelet/pods/807c2da3-ef0e-4e89-9457-37401354a8e9/volumes" Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.290216 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mtxkm_f7ca92c8-f38b-4a0a-b330-5809993cbb49/nmstate-console-plugin/0.log" Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.404339 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9j2tz_7105919f-ddac-45db-a8f7-bd927e5737df/nmstate-handler/0.log" Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.664895 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/kube-rbac-proxy/0.log" Mar 01 10:20:14 crc kubenswrapper[4792]: I0301 10:20:14.867389 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/nmstate-metrics/0.log" Mar 01 10:20:15 crc kubenswrapper[4792]: I0301 10:20:15.014212 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-chfpw_fb942d1c-2a1a-4265-ae29-02f185d4cc40/nmstate-operator/0.log" Mar 01 10:20:15 crc kubenswrapper[4792]: I0301 10:20:15.054129 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-zwhpc_aa2300d6-10c0-4dc9-812a-fcb30f09920e/nmstate-webhook/0.log" Mar 01 10:20:39 crc kubenswrapper[4792]: I0301 10:20:39.685928 4792 scope.go:117] "RemoveContainer" containerID="b30f740be1f0c17125e3d3e62c51a8c03429e4e9dd6640d5fa13fe74408f6823" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.158994 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/kube-rbac-proxy/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.287631 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/controller/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.481750 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.663608 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.679855 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.686360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.732892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.972231 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.995984 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:20:44 crc kubenswrapper[4792]: I0301 10:20:44.998268 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.061427 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.181101 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.186357 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.212327 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.283113 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/controller/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.425652 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr-metrics/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.477697 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.543041 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy-frr/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.709524 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/reloader/0.log" Mar 01 10:20:45 crc kubenswrapper[4792]: I0301 10:20:45.819253 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-kfnzk_d2f0572c-e661-495c-873c-6e2d18f2ab7d/frr-k8s-webhook-server/0.log" Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.090329 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cd84fcfbc-lrpmz_ba22e25a-31e8-4ca7-b169-f7433eda818b/manager/0.log" Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.308552 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-776c7d78bd-jwfh6_cf86866e-8afa-44da-a688-e1c018a025bd/webhook-server/0.log" Mar 01 10:20:46 crc kubenswrapper[4792]: I0301 10:20:46.322564 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/kube-rbac-proxy/0.log" Mar 01 10:20:47 crc kubenswrapper[4792]: I0301 10:20:47.033033 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/speaker/0.log" Mar 01 10:20:47 crc kubenswrapper[4792]: I0301 10:20:47.063113 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr/0.log" Mar 01 10:21:01 crc kubenswrapper[4792]: I0301 10:21:01.914890 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.133283 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.180310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.180391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.331018 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.347491 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.356154 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/extract/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.599684 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 
01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.711537 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.738983 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.761365 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.933293 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:21:02 crc kubenswrapper[4792]: I0301 10:21:02.940275 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.132081 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.389050 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/registry-server/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.405112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.427325 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.469015 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.685201 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:21:03 crc kubenswrapper[4792]: I0301 10:21:03.714729 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.089516 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.206420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.367002 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.390128 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.479782 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/registry-server/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.711818 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.712338 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.716816 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/extract/0.log" Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.943307 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:21:04 crc kubenswrapper[4792]: I0301 10:21:04.943361 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.013935 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfkbs_46fe59e7-8122-4621-ae8d-237a91daee5e/marketplace-operator/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.033475 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.251791 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.297588 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.371467 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.488325 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.566175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.672106 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/registry-server/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.716410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.898783 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.907567 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:21:05 crc kubenswrapper[4792]: I0301 10:21:05.949239 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.089452 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.109349 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:21:06 crc kubenswrapper[4792]: I0301 10:21:06.621486 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/registry-server/0.log" Mar 01 10:21:34 crc kubenswrapper[4792]: I0301 10:21:34.943609 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:21:34 crc kubenswrapper[4792]: I0301 10:21:34.944118 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.140619 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"] Mar 01 10:22:00 crc kubenswrapper[4792]: E0301 10:22:00.141602 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.141619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.141815 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" containerName="oc" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.142515 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.144549 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.144986 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.149529 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.160640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"] Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.322194 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod 
\"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.424060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.449299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"auto-csr-approver-29539342-5t24p\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.480245 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:00 crc kubenswrapper[4792]: I0301 10:22:00.965667 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"] Mar 01 10:22:01 crc kubenswrapper[4792]: I0301 10:22:01.209436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerStarted","Data":"f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e"} Mar 01 10:22:03 crc kubenswrapper[4792]: I0301 10:22:03.237026 4792 generic.go:334] "Generic (PLEG): container finished" podID="821a550c-e6ff-4517-a306-11ea497be759" containerID="ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463" exitCode=0 Mar 01 10:22:03 crc kubenswrapper[4792]: I0301 10:22:03.237286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerDied","Data":"ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463"} Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.866345 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.923556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") pod \"821a550c-e6ff-4517-a306-11ea497be759\" (UID: \"821a550c-e6ff-4517-a306-11ea497be759\") " Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956140 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj" (OuterVolumeSpecName: "kube-api-access-2tjsj") pod "821a550c-e6ff-4517-a306-11ea497be759" (UID: "821a550c-e6ff-4517-a306-11ea497be759"). InnerVolumeSpecName "kube-api-access-2tjsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956459 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956495 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.956533 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.957235 4792 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:22:04 crc kubenswrapper[4792]: I0301 10:22:04.957294 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" gracePeriod=600 Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.029232 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tjsj\" (UniqueName: \"kubernetes.io/projected/821a550c-e6ff-4517-a306-11ea497be759-kube-api-access-2tjsj\") on node \"crc\" DevicePath \"\"" Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256562 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" exitCode=0 Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7"} Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.256968 4792 scope.go:117] "RemoveContainer" containerID="5559329cdd2d06c55138df9db1883f7a3d80cc77e580af010ae0eff1869520de" Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539342-5t24p" 
event={"ID":"821a550c-e6ff-4517-a306-11ea497be759","Type":"ContainerDied","Data":"f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e"} Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259478 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f117d9ce2e607aa3abf587259dd00adb7c06fd9684234b0efa083d157e7c032e" Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.259530 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539342-5t24p" Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.948167 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:22:05 crc kubenswrapper[4792]: I0301 10:22:05.956693 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539336-vnhz7"] Mar 01 10:22:06 crc kubenswrapper[4792]: I0301 10:22:06.270180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"} Mar 01 10:22:07 crc kubenswrapper[4792]: I0301 10:22:07.420427 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731500c7-53e0-431a-9ec7-7e56ef9c11ee" path="/var/lib/kubelet/pods/731500c7-53e0-431a-9ec7-7e56ef9c11ee/volumes" Mar 01 10:22:39 crc kubenswrapper[4792]: I0301 10:22:39.840763 4792 scope.go:117] "RemoveContainer" containerID="457400b5efe76b08cc35b2db4e3b43b3bb9dd95107e6fe2815b55395cf597f33" Mar 01 10:22:39 crc kubenswrapper[4792]: I0301 10:22:39.925039 4792 scope.go:117] "RemoveContainer" containerID="12a4ef8d3fa3b5f92a7e689e1d88b84e48e9aed058fb8fe4d0d9b00831ceeea9" Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.020965 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="c72d6020-9460-4198-863a-ec32bc90fee9" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" exitCode=0 Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.021027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8szkd/must-gather-5ntfl" event={"ID":"c72d6020-9460-4198-863a-ec32bc90fee9","Type":"ContainerDied","Data":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"} Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.022214 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:32 crc kubenswrapper[4792]: I0301 10:23:32.908411 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/gather/0.log" Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.691823 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.696541 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8szkd/must-gather-5ntfl" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" containerID="cri-o://882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" gracePeriod=2 Mar 01 10:23:41 crc kubenswrapper[4792]: I0301 10:23:41.725258 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8szkd/must-gather-5ntfl"] Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.101122 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/copy/0.log" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.101860 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.114997 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8szkd_must-gather-5ntfl_c72d6020-9460-4198-863a-ec32bc90fee9/copy/0.log" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115642 4792 generic.go:334] "Generic (PLEG): container finished" podID="c72d6020-9460-4198-863a-ec32bc90fee9" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" exitCode=143 Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115698 4792 scope.go:117] "RemoveContainer" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.115724 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8szkd/must-gather-5ntfl" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.134543 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.187672 4792 scope.go:117] "RemoveContainer" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: E0301 10:23:42.188380 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": container with ID starting with 882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a not found: ID does not exist" containerID="882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.188429 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a"} err="failed 
to get container status \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": rpc error: code = NotFound desc = could not find container \"882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a\": container with ID starting with 882ed7a11f9213b4a2466a64a60d624aeb96a25ba7c34fc56d839ecd11da0b1a not found: ID does not exist" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.188465 4792 scope.go:117] "RemoveContainer" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: E0301 10:23:42.189057 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": container with ID starting with 3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c not found: ID does not exist" containerID="3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.189179 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c"} err="failed to get container status \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": rpc error: code = NotFound desc = could not find container \"3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c\": container with ID starting with 3040d0fb9fd765268819a187e562939355666f759f1f77a68a46d8ecca69408c not found: ID does not exist" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.257405 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") pod \"c72d6020-9460-4198-863a-ec32bc90fee9\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " Mar 01 10:23:42 crc kubenswrapper[4792]: 
I0301 10:23:42.257549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") pod \"c72d6020-9460-4198-863a-ec32bc90fee9\" (UID: \"c72d6020-9460-4198-863a-ec32bc90fee9\") " Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.262846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7" (OuterVolumeSpecName: "kube-api-access-58lp7") pod "c72d6020-9460-4198-863a-ec32bc90fee9" (UID: "c72d6020-9460-4198-863a-ec32bc90fee9"). InnerVolumeSpecName "kube-api-access-58lp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.359672 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58lp7\" (UniqueName: \"kubernetes.io/projected/c72d6020-9460-4198-863a-ec32bc90fee9-kube-api-access-58lp7\") on node \"crc\" DevicePath \"\"" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.433223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c72d6020-9460-4198-863a-ec32bc90fee9" (UID: "c72d6020-9460-4198-863a-ec32bc90fee9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:23:42 crc kubenswrapper[4792]: I0301 10:23:42.462184 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c72d6020-9460-4198-863a-ec32bc90fee9-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 01 10:23:43 crc kubenswrapper[4792]: I0301 10:23:43.422933 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" path="/var/lib/kubelet/pods/c72d6020-9460-4198-863a-ec32bc90fee9/volumes" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.141826 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142863 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142879 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142898 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142922 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: E0301 10:24:00.142937 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.142944 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143209 4792 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="gather" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143227 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="821a550c-e6ff-4517-a306-11ea497be759" containerName="oc" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.143242 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72d6020-9460-4198-863a-ec32bc90fee9" containerName="copy" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.144059 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.148232 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.148398 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.150168 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.175267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.296939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.398806 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xps49\" (UniqueName: 
\"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.428228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"auto-csr-approver-29539344-r7v49\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.465088 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:00 crc kubenswrapper[4792]: I0301 10:24:00.903593 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:24:01 crc kubenswrapper[4792]: I0301 10:24:01.293605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerStarted","Data":"e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6"} Mar 01 10:24:02 crc kubenswrapper[4792]: I0301 10:24:02.301823 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerID="1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e" exitCode=0 Mar 01 10:24:02 crc kubenswrapper[4792]: I0301 10:24:02.301871 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerDied","Data":"1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e"} Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.658263 4792 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.773684 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") pod \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\" (UID: \"6c5eb940-780b-4b4d-ab60-e1ad0c284811\") " Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.780666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49" (OuterVolumeSpecName: "kube-api-access-xps49") pod "6c5eb940-780b-4b4d-ab60-e1ad0c284811" (UID: "6c5eb940-780b-4b4d-ab60-e1ad0c284811"). InnerVolumeSpecName "kube-api-access-xps49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:24:03 crc kubenswrapper[4792]: I0301 10:24:03.876586 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xps49\" (UniqueName: \"kubernetes.io/projected/6c5eb940-780b-4b4d-ab60-e1ad0c284811-kube-api-access-xps49\") on node \"crc\" DevicePath \"\"" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322769 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539344-r7v49" event={"ID":"6c5eb940-780b-4b4d-ab60-e1ad0c284811","Type":"ContainerDied","Data":"e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6"} Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322819 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539344-r7v49" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.322830 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a4c4d9ec6ade48f56616e194ac4074ba88b48e6166927718481562f18e30d6" Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.731324 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:24:04 crc kubenswrapper[4792]: I0301 10:24:04.739356 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539338-kcgfx"] Mar 01 10:24:05 crc kubenswrapper[4792]: I0301 10:24:05.422428 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36fce8b-7564-4d74-b3ad-9bfe9979cb67" path="/var/lib/kubelet/pods/a36fce8b-7564-4d74-b3ad-9bfe9979cb67/volumes" Mar 01 10:24:34 crc kubenswrapper[4792]: I0301 10:24:34.943215 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:24:34 crc kubenswrapper[4792]: I0301 10:24:34.943778 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:24:40 crc kubenswrapper[4792]: I0301 10:24:40.030564 4792 scope.go:117] "RemoveContainer" containerID="dd7d1b10f4227bfce662716c9cbd0e72c4c6de3ac564b41925cd39f0160eafdf" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.475807 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 
10:24:47 crc kubenswrapper[4792]: E0301 10:24:47.478002 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.478159 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.478427 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" containerName="oc" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.479947 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.511182 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548063 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.548297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.649934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.650347 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.650438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.674785 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"certified-operators-ws9zl\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:47 crc kubenswrapper[4792]: I0301 10:24:47.801205 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.153918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851400 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" exitCode=0 Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e"} Mar 01 10:24:48 crc kubenswrapper[4792]: I0301 10:24:48.851751 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"6dfcf844b73ec95251bb5d3c6c8335ba419bfbd6d26edbde18758b36514066bd"} Mar 01 10:24:49 crc kubenswrapper[4792]: I0301 10:24:49.860670 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} Mar 01 10:24:51 crc kubenswrapper[4792]: I0301 10:24:51.879564 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" exitCode=0 Mar 01 10:24:51 crc kubenswrapper[4792]: I0301 10:24:51.879649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.276935 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.282317 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.288005 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.368881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.368965 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.371365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.473893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.474242 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.474447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.477367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.478005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.502788 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"redhat-marketplace-h67d8\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.600482 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.892072 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerStarted","Data":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} Mar 01 10:24:52 crc kubenswrapper[4792]: I0301 10:24:52.919384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ws9zl" podStartSLOduration=2.500398993 podStartE2EDuration="5.919362643s" podCreationTimestamp="2026-03-01 10:24:47 +0000 UTC" firstStartedPulling="2026-03-01 10:24:48.853571282 +0000 UTC m=+4618.095450479" lastFinishedPulling="2026-03-01 10:24:52.272534932 +0000 UTC m=+4621.514414129" observedRunningTime="2026-03-01 10:24:52.91363953 +0000 UTC m=+4622.155518737" watchObservedRunningTime="2026-03-01 10:24:52.919362643 +0000 UTC m=+4622.161241840" Mar 01 10:24:53 crc kubenswrapper[4792]: W0301 10:24:53.071162 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582fa17a_143d_4bb0_a32e_a6d30f3e3754.slice/crio-d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310 WatchSource:0}: Error finding container d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310: Status 404 returned error can't find the container with id d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310 Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.083176 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903027 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" 
exitCode=0 Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903329 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1"} Mar 01 10:24:53 crc kubenswrapper[4792]: I0301 10:24:53.903353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310"} Mar 01 10:24:54 crc kubenswrapper[4792]: I0301 10:24:54.915648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} Mar 01 10:24:56 crc kubenswrapper[4792]: I0301 10:24:56.936215 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" exitCode=0 Mar 01 10:24:56 crc kubenswrapper[4792]: I0301 10:24:56.936865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.802203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.802695 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.853895 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.946625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerStarted","Data":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} Mar 01 10:24:57 crc kubenswrapper[4792]: I0301 10:24:57.966100 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h67d8" podStartSLOduration=2.513508012 podStartE2EDuration="5.966074873s" podCreationTimestamp="2026-03-01 10:24:52 +0000 UTC" firstStartedPulling="2026-03-01 10:24:53.906202131 +0000 UTC m=+4623.148081328" lastFinishedPulling="2026-03-01 10:24:57.358768992 +0000 UTC m=+4626.600648189" observedRunningTime="2026-03-01 10:24:57.963066278 +0000 UTC m=+4627.204945515" watchObservedRunningTime="2026-03-01 10:24:57.966074873 +0000 UTC m=+4627.207954070" Mar 01 10:24:58 crc kubenswrapper[4792]: I0301 10:24:58.001407 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:24:59 crc kubenswrapper[4792]: I0301 10:24:59.464066 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:24:59 crc kubenswrapper[4792]: I0301 10:24:59.965654 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ws9zl" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" containerID="cri-o://861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" gracePeriod=2 Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.959131 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974648 4792 generic.go:334] "Generic (PLEG): container finished" podID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" exitCode=0 Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974724 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws9zl" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974742 4792 scope.go:117] "RemoveContainer" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.974731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws9zl" event={"ID":"556f4724-e050-4a4f-b7a3-680d9d7f08c5","Type":"ContainerDied","Data":"6dfcf844b73ec95251bb5d3c6c8335ba419bfbd6d26edbde18758b36514066bd"} Mar 01 10:25:00 crc kubenswrapper[4792]: I0301 10:25:00.991515 4792 scope.go:117] "RemoveContainer" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.008266 4792 scope.go:117] "RemoveContainer" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051400 4792 scope.go:117] "RemoveContainer" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.051844 4792 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": container with ID starting with 861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af not found: ID does not exist" containerID="861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051878 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af"} err="failed to get container status \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": rpc error: code = NotFound desc = could not find container \"861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af\": container with ID starting with 861184e477f3e515517f157d27fb90a7b0c209ddadfbd732c4c22b992ba1c5af not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.051960 4792 scope.go:117] "RemoveContainer" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.052266 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": container with ID starting with e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40 not found: ID does not exist" containerID="e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40"} err="failed to get container status \"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": rpc error: code = NotFound desc = could not find container 
\"e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40\": container with ID starting with e4853fa909b6ae75d307a4cc489cd9628705342927483c4ad7be8c32cbd91c40 not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052314 4792 scope.go:117] "RemoveContainer" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: E0301 10:25:01.052778 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": container with ID starting with 0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e not found: ID does not exist" containerID="0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.052860 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e"} err="failed to get container status \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": rpc error: code = NotFound desc = could not find container \"0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e\": container with ID starting with 0d7de9e7b3f536d936e56a7ed234d1bc1331199661c31959d06ba45c5299b92e not found: ID does not exist" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55npp\" (UniqueName: 
\"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.071849 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") pod \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\" (UID: \"556f4724-e050-4a4f-b7a3-680d9d7f08c5\") " Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.073098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities" (OuterVolumeSpecName: "utilities") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.076949 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp" (OuterVolumeSpecName: "kube-api-access-55npp") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "kube-api-access-55npp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.121421 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "556f4724-e050-4a4f-b7a3-680d9d7f08c5" (UID: "556f4724-e050-4a4f-b7a3-680d9d7f08c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174480 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55npp\" (UniqueName: \"kubernetes.io/projected/556f4724-e050-4a4f-b7a3-680d9d7f08c5-kube-api-access-55npp\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174528 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.174541 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/556f4724-e050-4a4f-b7a3-680d9d7f08c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.312671 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.325181 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ws9zl"] Mar 01 10:25:01 crc kubenswrapper[4792]: I0301 10:25:01.419555 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" path="/var/lib/kubelet/pods/556f4724-e050-4a4f-b7a3-680d9d7f08c5/volumes" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.602043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.602440 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:02 crc kubenswrapper[4792]: I0301 10:25:02.682084 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:03 crc kubenswrapper[4792]: I0301 10:25:03.040948 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:03 crc kubenswrapper[4792]: I0301 10:25:03.663257 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:04 crc kubenswrapper[4792]: I0301 10:25:04.942969 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:25:04 crc kubenswrapper[4792]: I0301 10:25:04.943318 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.014334 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h67d8" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" containerID="cri-o://2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" gracePeriod=2 Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.555740 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.683944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") pod \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\" (UID: \"582fa17a-143d-4bb0-a32e-a6d30f3e3754\") " Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.684757 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities" (OuterVolumeSpecName: "utilities") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.691363 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd" (OuterVolumeSpecName: "kube-api-access-qcqwd") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "kube-api-access-qcqwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.712449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "582fa17a-143d-4bb0-a32e-a6d30f3e3754" (UID: "582fa17a-143d-4bb0-a32e-a6d30f3e3754"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787278 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787316 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqwd\" (UniqueName: \"kubernetes.io/projected/582fa17a-143d-4bb0-a32e-a6d30f3e3754-kube-api-access-qcqwd\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:05 crc kubenswrapper[4792]: I0301 10:25:05.787329 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/582fa17a-143d-4bb0-a32e-a6d30f3e3754-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026454 4792 generic.go:334] "Generic (PLEG): container finished" podID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" exitCode=0 Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026565 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h67d8" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026757 4792 scope.go:117] "RemoveContainer" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.026739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h67d8" event={"ID":"582fa17a-143d-4bb0-a32e-a6d30f3e3754","Type":"ContainerDied","Data":"d04038240a388695601510cacf85197a316c819f1b4f9a710217ff1d52887310"} Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.047149 4792 scope.go:117] "RemoveContainer" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.071787 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.087511 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h67d8"] Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.096181 4792 scope.go:117] "RemoveContainer" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.148659 4792 scope.go:117] "RemoveContainer" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 10:25:06.149181 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": container with ID starting with 2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4 not found: ID does not exist" containerID="2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.149217 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4"} err="failed to get container status \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": rpc error: code = NotFound desc = could not find container \"2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4\": container with ID starting with 2e70dc111e2d37aee7ee8720da31db75627890586f0bb80926fbcccdee190bb4 not found: ID does not exist" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.149241 4792 scope.go:117] "RemoveContainer" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 10:25:06.150009 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": container with ID starting with a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1 not found: ID does not exist" containerID="a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150040 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1"} err="failed to get container status \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": rpc error: code = NotFound desc = could not find container \"a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1\": container with ID starting with a288e2f2cd47e6a54f7d84d346120c0d4ba093502cdfe3c581395af6e149a1b1 not found: ID does not exist" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150061 4792 scope.go:117] "RemoveContainer" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: E0301 
10:25:06.150470 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": container with ID starting with db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1 not found: ID does not exist" containerID="db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1" Mar 01 10:25:06 crc kubenswrapper[4792]: I0301 10:25:06.150497 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1"} err="failed to get container status \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": rpc error: code = NotFound desc = could not find container \"db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1\": container with ID starting with db712e4845a42511840eb3f9112057fd1851197bb2b005495491c047312918b1 not found: ID does not exist" Mar 01 10:25:07 crc kubenswrapper[4792]: I0301 10:25:07.422540 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" path="/var/lib/kubelet/pods/582fa17a-143d-4bb0-a32e-a6d30f3e3754/volumes" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943405 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943864 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.943899 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.944628 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:25:34 crc kubenswrapper[4792]: I0301 10:25:34.944676 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" gracePeriod=600 Mar 01 10:25:35 crc kubenswrapper[4792]: E0301 10:25:35.062959 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266553 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" exitCode=0 Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43"} Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.266642 4792 scope.go:117] "RemoveContainer" containerID="34d9173341b46ccf37e4f77b26afd17d6e6d0f7b2699af960d32ad54ab5e3db7" Mar 01 10:25:35 crc kubenswrapper[4792]: I0301 10:25:35.267339 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:35 crc kubenswrapper[4792]: E0301 10:25:35.267655 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:47 crc kubenswrapper[4792]: I0301 10:25:47.410078 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:47 crc kubenswrapper[4792]: E0301 10:25:47.412106 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:25:59 crc kubenswrapper[4792]: I0301 10:25:59.408948 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:25:59 crc kubenswrapper[4792]: E0301 10:25:59.411441 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.155230 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156589 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156614 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156633 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156642 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-utilities" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156659 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156669 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156684 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156692 
4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156713 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156722 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: E0301 10:26:00.156745 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156754 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="extract-content" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.156990 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="556f4724-e050-4a4f-b7a3-680d9d7f08c5" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.157031 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="582fa17a-143d-4bb0-a32e-a6d30f3e3754" containerName="registry-server" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.157900 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.160489 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.161782 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.163731 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.163747 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.347069 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.449598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.473650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"auto-csr-approver-29539346-l8ktk\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " 
pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.477136 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.921343 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:26:00 crc kubenswrapper[4792]: I0301 10:26:00.926625 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:26:01 crc kubenswrapper[4792]: I0301 10:26:01.555932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerStarted","Data":"c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e"} Mar 01 10:26:02 crc kubenswrapper[4792]: I0301 10:26:02.565086 4792 generic.go:334] "Generic (PLEG): container finished" podID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerID="f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf" exitCode=0 Mar 01 10:26:02 crc kubenswrapper[4792]: I0301 10:26:02.565152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerDied","Data":"f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf"} Mar 01 10:26:03 crc kubenswrapper[4792]: I0301 10:26:03.937580 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.026445 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") pod \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\" (UID: \"9c23503b-d97f-4cef-b792-e7fbdd8934ab\") " Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.033745 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2" (OuterVolumeSpecName: "kube-api-access-dqhq2") pod "9c23503b-d97f-4cef-b792-e7fbdd8934ab" (UID: "9c23503b-d97f-4cef-b792-e7fbdd8934ab"). InnerVolumeSpecName "kube-api-access-dqhq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.129374 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqhq2\" (UniqueName: \"kubernetes.io/projected/9c23503b-d97f-4cef-b792-e7fbdd8934ab-kube-api-access-dqhq2\") on node \"crc\" DevicePath \"\"" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" event={"ID":"9c23503b-d97f-4cef-b792-e7fbdd8934ab","Type":"ContainerDied","Data":"c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e"} Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587348 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98fd7e3154616a93d3b7d0af38bc303e4e5ccabeefc5244bdc91aee0c92792e" Mar 01 10:26:04 crc kubenswrapper[4792]: I0301 10:26:04.587372 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539346-l8ktk" Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.020871 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.033924 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539340-rjnp9"] Mar 01 10:26:05 crc kubenswrapper[4792]: I0301 10:26:05.424504 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d79ac35-053d-480b-a8ef-3b03122b0152" path="/var/lib/kubelet/pods/5d79ac35-053d-480b-a8ef-3b03122b0152/volumes" Mar 01 10:26:14 crc kubenswrapper[4792]: I0301 10:26:14.409465 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:14 crc kubenswrapper[4792]: E0301 10:26:14.410231 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:26 crc kubenswrapper[4792]: I0301 10:26:26.408868 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:26 crc kubenswrapper[4792]: E0301 10:26:26.409708 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:40 crc kubenswrapper[4792]: I0301 10:26:40.156099 4792 scope.go:117] "RemoveContainer" containerID="fd47f4e8c316ceef6350038b81becae73fa982c181d0ebd8621bb407bf6fb4b7" Mar 01 10:26:41 crc kubenswrapper[4792]: I0301 10:26:41.416523 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:41 crc kubenswrapper[4792]: E0301 10:26:41.417366 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.538201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:46 crc kubenswrapper[4792]: E0301 10:26:46.539089 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.539105 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.539329 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" containerName="oc" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.540521 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.548367 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gqc89"/"openshift-service-ca.crt" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.549604 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gqc89"/"kube-root-ca.crt" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.576758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.669896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.670109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.772843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.792178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"must-gather-vdffg\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:46 crc kubenswrapper[4792]: I0301 10:26:46.867292 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:26:47 crc kubenswrapper[4792]: I0301 10:26:47.370864 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:26:48 crc kubenswrapper[4792]: I0301 10:26:48.032338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} Mar 01 10:26:48 crc kubenswrapper[4792]: I0301 10:26:48.033292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"1565ec9980556fdbf01e99188e64fa14f5e2ec76dbcd766030cf211adcc3fc4e"} Mar 01 10:26:49 crc kubenswrapper[4792]: I0301 10:26:49.042630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerStarted","Data":"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560"} Mar 01 10:26:49 crc kubenswrapper[4792]: I0301 10:26:49.066493 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gqc89/must-gather-vdffg" podStartSLOduration=3.066476008 podStartE2EDuration="3.066476008s" podCreationTimestamp="2026-03-01 10:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:26:49.059476612 +0000 UTC m=+4738.301355809" watchObservedRunningTime="2026-03-01 10:26:49.066476008 +0000 UTC m=+4738.308355205" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.479581 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:26:52 crc kubenswrapper[4792]: 
I0301 10:26:52.481533 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.483415 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gqc89"/"default-dockercfg-cw4kn" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.518336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.518723 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.620619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.638823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"crc-debug-tb4mv\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: I0301 10:26:52.799477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:26:52 crc kubenswrapper[4792]: W0301 10:26:52.834080 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45184b8_fa23_4abb_8e90_7b490f7d3c04.slice/crio-22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576 WatchSource:0}: Error finding container 22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576: Status 404 returned error can't find the container with id 22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576 Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.087096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerStarted","Data":"9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5"} Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.087402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" 
event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerStarted","Data":"22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576"} Mar 01 10:26:53 crc kubenswrapper[4792]: I0301 10:26:53.104867 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" podStartSLOduration=1.104845853 podStartE2EDuration="1.104845853s" podCreationTimestamp="2026-03-01 10:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:26:53.098242918 +0000 UTC m=+4742.340122115" watchObservedRunningTime="2026-03-01 10:26:53.104845853 +0000 UTC m=+4742.346725050" Mar 01 10:26:56 crc kubenswrapper[4792]: I0301 10:26:56.409620 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:26:56 crc kubenswrapper[4792]: E0301 10:26:56.410376 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:08 crc kubenswrapper[4792]: I0301 10:27:08.409715 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:08 crc kubenswrapper[4792]: E0301 10:27:08.412439 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:23 crc kubenswrapper[4792]: I0301 10:27:23.409146 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:23 crc kubenswrapper[4792]: E0301 10:27:23.411060 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:27 crc kubenswrapper[4792]: I0301 10:27:27.377478 4792 generic.go:334] "Generic (PLEG): container finished" podID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerID="9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5" exitCode=0 Mar 01 10:27:27 crc kubenswrapper[4792]: I0301 10:27:27.377585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" event={"ID":"e45184b8-fa23-4abb-8e90-7b490f7d3c04","Type":"ContainerDied","Data":"9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5"} Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.539560 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.573080 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.581565 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-tb4mv"] Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.633658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") pod \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.633811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") pod \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\" (UID: \"e45184b8-fa23-4abb-8e90-7b490f7d3c04\") " Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.634481 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host" (OuterVolumeSpecName: "host") pod "e45184b8-fa23-4abb-8e90-7b490f7d3c04" (UID: "e45184b8-fa23-4abb-8e90-7b490f7d3c04"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.641050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg" (OuterVolumeSpecName: "kube-api-access-n9zzg") pod "e45184b8-fa23-4abb-8e90-7b490f7d3c04" (UID: "e45184b8-fa23-4abb-8e90-7b490f7d3c04"). InnerVolumeSpecName "kube-api-access-n9zzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.735943 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9zzg\" (UniqueName: \"kubernetes.io/projected/e45184b8-fa23-4abb-8e90-7b490f7d3c04-kube-api-access-n9zzg\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:28 crc kubenswrapper[4792]: I0301 10:27:28.735987 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e45184b8-fa23-4abb-8e90-7b490f7d3c04-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.399241 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ca56a9c95ce18c1e3892347e183b4bae0378f6608a3f7a7e5c37af1cff7576" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.399531 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-tb4mv" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.418978 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" path="/var/lib/kubelet/pods/e45184b8-fa23-4abb-8e90-7b490f7d3c04/volumes" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:29 crc kubenswrapper[4792]: E0301 10:27:29.761741 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761752 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.761972 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45184b8-fa23-4abb-8e90-7b490f7d3c04" 
containerName="container-00" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.762564 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.766103 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gqc89"/"default-dockercfg-cw4kn" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.858143 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.858214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.959973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.960055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: 
I0301 10:27:29.960348 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:29 crc kubenswrapper[4792]: I0301 10:27:29.977729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"crc-debug-b5clz\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.080388 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408370 4792 generic.go:334] "Generic (PLEG): container finished" podID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerID="44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75" exitCode=0 Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408464 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-b5clz" event={"ID":"906114a6-330e-47e1-a2d5-b2629604ec9b","Type":"ContainerDied","Data":"44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75"} Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.408643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-b5clz" event={"ID":"906114a6-330e-47e1-a2d5-b2629604ec9b","Type":"ContainerStarted","Data":"a698a724eea7987fdc30fb2d5e78695ac28cddc639344b0f696d184f93a2fc3d"} Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 10:27:30.841307 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:30 crc kubenswrapper[4792]: I0301 
10:27:30.850279 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-b5clz"] Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.509710 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632494 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") pod \"906114a6-330e-47e1-a2d5-b2629604ec9b\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632560 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") pod \"906114a6-330e-47e1-a2d5-b2629604ec9b\" (UID: \"906114a6-330e-47e1-a2d5-b2629604ec9b\") " Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.632656 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host" (OuterVolumeSpecName: "host") pod "906114a6-330e-47e1-a2d5-b2629604ec9b" (UID: "906114a6-330e-47e1-a2d5-b2629604ec9b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.633461 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/906114a6-330e-47e1-a2d5-b2629604ec9b-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.660579 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x" (OuterVolumeSpecName: "kube-api-access-d4s4x") pod "906114a6-330e-47e1-a2d5-b2629604ec9b" (UID: "906114a6-330e-47e1-a2d5-b2629604ec9b"). InnerVolumeSpecName "kube-api-access-d4s4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:27:31 crc kubenswrapper[4792]: I0301 10:27:31.735718 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4s4x\" (UniqueName: \"kubernetes.io/projected/906114a6-330e-47e1-a2d5-b2629604ec9b-kube-api-access-d4s4x\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.256300 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"] Mar 01 10:27:32 crc kubenswrapper[4792]: E0301 10:27:32.256977 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.256990 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.257195 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" containerName="container-00" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.257763 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.429021 4792 scope.go:117] "RemoveContainer" containerID="44972f8cdfee40aaa4b7d768644c72d5ce1a862062e2e025755a87475cc93e75" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.429250 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-b5clz" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.447467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.447855 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550147 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" 
Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.550386 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.861436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"crc-debug-f7z2b\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: I0301 10:27:32.878415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:32 crc kubenswrapper[4792]: W0301 10:27:32.912854 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87295b8e_2de6_45e5_8a8c_67223695843f.slice/crio-5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901 WatchSource:0}: Error finding container 5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901: Status 404 returned error can't find the container with id 5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901 Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.419122 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906114a6-330e-47e1-a2d5-b2629604ec9b" path="/var/lib/kubelet/pods/906114a6-330e-47e1-a2d5-b2629604ec9b/volumes" Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.438548 4792 generic.go:334] "Generic (PLEG): container finished" podID="87295b8e-2de6-45e5-8a8c-67223695843f" containerID="f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75" exitCode=0 Mar 01 10:27:33 crc 
kubenswrapper[4792]: I0301 10:27:33.438586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" event={"ID":"87295b8e-2de6-45e5-8a8c-67223695843f","Type":"ContainerDied","Data":"f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75"} Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.438648 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" event={"ID":"87295b8e-2de6-45e5-8a8c-67223695843f","Type":"ContainerStarted","Data":"5043b64d50537b6cfaf21fbd5e8cfde7f34c613a4ac57d32c2db7b8106aab901"} Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.499769 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"] Mar 01 10:27:33 crc kubenswrapper[4792]: I0301 10:27:33.509944 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/crc-debug-f7z2b"] Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.545107 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.697133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") pod \"87295b8e-2de6-45e5-8a8c-67223695843f\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.697254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") pod \"87295b8e-2de6-45e5-8a8c-67223695843f\" (UID: \"87295b8e-2de6-45e5-8a8c-67223695843f\") " Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.698042 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host" (OuterVolumeSpecName: "host") pod "87295b8e-2de6-45e5-8a8c-67223695843f" (UID: "87295b8e-2de6-45e5-8a8c-67223695843f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.703946 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw" (OuterVolumeSpecName: "kube-api-access-76rpw") pod "87295b8e-2de6-45e5-8a8c-67223695843f" (UID: "87295b8e-2de6-45e5-8a8c-67223695843f"). InnerVolumeSpecName "kube-api-access-76rpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.799839 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/87295b8e-2de6-45e5-8a8c-67223695843f-host\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:34 crc kubenswrapper[4792]: I0301 10:27:34.799874 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76rpw\" (UniqueName: \"kubernetes.io/projected/87295b8e-2de6-45e5-8a8c-67223695843f-kube-api-access-76rpw\") on node \"crc\" DevicePath \"\"" Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.426344 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" path="/var/lib/kubelet/pods/87295b8e-2de6-45e5-8a8c-67223695843f/volumes" Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.460099 4792 scope.go:117] "RemoveContainer" containerID="f8e7a7cf72954e5a371a752e566b57991977d10fab6f499762b2d8f952868a75" Mar 01 10:27:35 crc kubenswrapper[4792]: I0301 10:27:35.460361 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/crc-debug-f7z2b" Mar 01 10:27:36 crc kubenswrapper[4792]: I0301 10:27:36.410226 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:36 crc kubenswrapper[4792]: E0301 10:27:36.410822 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.409104 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:27:49 crc kubenswrapper[4792]: E0301 10:27:49.409883 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.951514 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:27:49 crc kubenswrapper[4792]: E0301 10:27:49.952582 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00" Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.952695 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00" Mar 01 10:27:49 crc 
kubenswrapper[4792]: I0301 10:27:49.953064 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87295b8e-2de6-45e5-8a8c-67223695843f" containerName="container-00" Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.954835 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:49 crc kubenswrapper[4792]: I0301 10:27:49.971505 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.026586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.127934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128507 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.128551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.154324 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxk6\" (UniqueName: 
\"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"redhat-operators-cxhh6\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.311193 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:27:50 crc kubenswrapper[4792]: I0301 10:27:50.820461 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:27:51 crc kubenswrapper[4792]: I0301 10:27:51.629382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"1118d4695af3d42785b70cb2f47d1de1e099600c2214f5c7391a3af697aaf1ab"} Mar 01 10:27:52 crc kubenswrapper[4792]: I0301 10:27:52.637630 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31" exitCode=0 Mar 01 10:27:52 crc kubenswrapper[4792]: I0301 10:27:52.637773 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"} Mar 01 10:27:53 crc kubenswrapper[4792]: I0301 10:27:53.646646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"} Mar 01 10:27:58 crc kubenswrapper[4792]: I0301 10:27:58.686007 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" 
containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b" exitCode=0 Mar 01 10:27:58 crc kubenswrapper[4792]: I0301 10:27:58.686631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"} Mar 01 10:27:59 crc kubenswrapper[4792]: I0301 10:27:59.696474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerStarted","Data":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"} Mar 01 10:27:59 crc kubenswrapper[4792]: I0301 10:27:59.715416 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxhh6" podStartSLOduration=4.252022057 podStartE2EDuration="10.715395777s" podCreationTimestamp="2026-03-01 10:27:49 +0000 UTC" firstStartedPulling="2026-03-01 10:27:52.639654778 +0000 UTC m=+4801.881533975" lastFinishedPulling="2026-03-01 10:27:59.103028498 +0000 UTC m=+4808.344907695" observedRunningTime="2026-03-01 10:27:59.712523085 +0000 UTC m=+4808.954402282" watchObservedRunningTime="2026-03-01 10:27:59.715395777 +0000 UTC m=+4808.957274974" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.152118 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.153979 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156009 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.156826 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.175046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.247928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.311570 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.311625 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.356397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " 
pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.374814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"auto-csr-approver-29539348-5hvql\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.409047 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:00 crc kubenswrapper[4792]: E0301 10:28:00.409317 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.485361 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:00 crc kubenswrapper[4792]: I0301 10:28:00.892401 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:28:01 crc kubenswrapper[4792]: I0301 10:28:01.365919 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxhh6" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" probeResult="failure" output=< Mar 01 10:28:01 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 01 10:28:01 crc kubenswrapper[4792]: > Mar 01 10:28:01 crc kubenswrapper[4792]: I0301 10:28:01.723534 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerStarted","Data":"8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e"} Mar 01 10:28:02 crc kubenswrapper[4792]: I0301 10:28:02.733248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerStarted","Data":"9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933"} Mar 01 10:28:02 crc kubenswrapper[4792]: I0301 10:28:02.747066 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539348-5hvql" podStartSLOduration=1.726261451 podStartE2EDuration="2.747044708s" podCreationTimestamp="2026-03-01 10:28:00 +0000 UTC" firstStartedPulling="2026-03-01 10:28:00.901929289 +0000 UTC m=+4810.143808486" lastFinishedPulling="2026-03-01 10:28:01.922712546 +0000 UTC m=+4811.164591743" observedRunningTime="2026-03-01 10:28:02.745990352 +0000 UTC m=+4811.987869549" watchObservedRunningTime="2026-03-01 10:28:02.747044708 +0000 UTC m=+4811.988923915" Mar 01 10:28:03 crc kubenswrapper[4792]: 
I0301 10:28:03.742770 4792 generic.go:334] "Generic (PLEG): container finished" podID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerID="9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933" exitCode=0 Mar 01 10:28:03 crc kubenswrapper[4792]: I0301 10:28:03.743201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerDied","Data":"9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933"} Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.110942 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.174959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") pod \"5867157f-b16a-460c-afc4-0981a4d8ee43\" (UID: \"5867157f-b16a-460c-afc4-0981a4d8ee43\") " Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.180533 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm" (OuterVolumeSpecName: "kube-api-access-ffmwm") pod "5867157f-b16a-460c-afc4-0981a4d8ee43" (UID: "5867157f-b16a-460c-afc4-0981a4d8ee43"). InnerVolumeSpecName "kube-api-access-ffmwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.276840 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffmwm\" (UniqueName: \"kubernetes.io/projected/5867157f-b16a-460c-afc4-0981a4d8ee43-kube-api-access-ffmwm\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.760799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539348-5hvql" event={"ID":"5867157f-b16a-460c-afc4-0981a4d8ee43","Type":"ContainerDied","Data":"8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e"} Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.761064 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1e2f3178dea59ab178a2142d754bd6d93e0069c8e569ad774e0d802ed3549e" Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.760866 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539348-5hvql" Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.821117 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"] Mar 01 10:28:05 crc kubenswrapper[4792]: I0301 10:28:05.832503 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539342-5t24p"] Mar 01 10:28:07 crc kubenswrapper[4792]: I0301 10:28:07.418237 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821a550c-e6ff-4517-a306-11ea497be759" path="/var/lib/kubelet/pods/821a550c-e6ff-4517-a306-11ea497be759/volumes" Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.361863 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.424823 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:10 crc kubenswrapper[4792]: I0301 10:28:10.606863 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:28:11 crc kubenswrapper[4792]: I0301 10:28:11.809429 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxhh6" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" containerID="cri-o://1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" gracePeriod=2 Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.295654 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.425831 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426144 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") pod \"12314c82-8a7f-466f-939a-158a53420f72\" (UID: \"12314c82-8a7f-466f-939a-158a53420f72\") " Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.426664 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities" (OuterVolumeSpecName: "utilities") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.431422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6" (OuterVolumeSpecName: "kube-api-access-4dxk6") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "kube-api-access-4dxk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.528205 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxk6\" (UniqueName: \"kubernetes.io/projected/12314c82-8a7f-466f-939a-158a53420f72-kube-api-access-4dxk6\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.528237 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.568754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12314c82-8a7f-466f-939a-158a53420f72" (UID: "12314c82-8a7f-466f-939a-158a53420f72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.630270 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12314c82-8a7f-466f-939a-158a53420f72-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820584 4792 generic.go:334] "Generic (PLEG): container finished" podID="12314c82-8a7f-466f-939a-158a53420f72" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" exitCode=0 Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"} Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820666 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxhh6" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxhh6" event={"ID":"12314c82-8a7f-466f-939a-158a53420f72","Type":"ContainerDied","Data":"1118d4695af3d42785b70cb2f47d1de1e099600c2214f5c7391a3af697aaf1ab"} Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.820766 4792 scope.go:117] "RemoveContainer" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.854443 4792 scope.go:117] "RemoveContainer" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.883253 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.891848 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxhh6"] Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.898625 4792 scope.go:117] "RemoveContainer" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.951669 4792 scope.go:117] "RemoveContainer" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 10:28:12.952168 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": container with ID starting with 1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505 not found: ID does not exist" containerID="1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952206 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505"} err="failed to get container status \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": rpc error: code = NotFound desc = could not find container \"1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505\": container with ID starting with 1ef091670c5fdfa64e056797c2116c2378a573407bcb8db5089d780fe9884505 not found: ID does not exist" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952229 4792 scope.go:117] "RemoveContainer" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b" Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 10:28:12.952449 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": container with ID starting with a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b not found: ID does not exist" containerID="a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952480 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b"} err="failed to get container status \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": rpc error: code = NotFound desc = could not find container \"a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b\": container with ID starting with a8620ea20d739c51308b4b8ed7430cafe7ba9fd563b914458c4b6bd618b02c6b not found: ID does not exist" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952500 4792 scope.go:117] "RemoveContainer" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31" Mar 01 10:28:12 crc kubenswrapper[4792]: E0301 
10:28:12.952704 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": container with ID starting with 8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31 not found: ID does not exist" containerID="8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31" Mar 01 10:28:12 crc kubenswrapper[4792]: I0301 10:28:12.952726 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31"} err="failed to get container status \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": rpc error: code = NotFound desc = could not find container \"8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31\": container with ID starting with 8691f7d7c721a269193a4a6cb58015f74a7ec254848c8d4d97fca1653885ba31 not found: ID does not exist" Mar 01 10:28:13 crc kubenswrapper[4792]: I0301 10:28:13.418961 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12314c82-8a7f-466f-939a-158a53420f72" path="/var/lib/kubelet/pods/12314c82-8a7f-466f-939a-158a53420f72/volumes" Mar 01 10:28:14 crc kubenswrapper[4792]: I0301 10:28:14.408861 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:14 crc kubenswrapper[4792]: E0301 10:28:14.409177 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.463282 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464408 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464439 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464451 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464459 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc" Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464505 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-utilities" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464515 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-utilities" Mar 01 10:28:22 crc kubenswrapper[4792]: E0301 10:28:22.464529 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-content" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464536 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="extract-content" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464777 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" containerName="oc" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.464818 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12314c82-8a7f-466f-939a-158a53420f72" containerName="registry-server" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.466580 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.477367 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.574932 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.575324 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.575404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.677398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" 
(UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678080 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678338 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.678628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.712497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"community-operators-k4zrv\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " 
pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:22 crc kubenswrapper[4792]: I0301 10:28:22.845001 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.447812 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:23 crc kubenswrapper[4792]: W0301 10:28:23.455078 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a74a274_b6f7_421d_917b_33034be46bcf.slice/crio-83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d WatchSource:0}: Error finding container 83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d: Status 404 returned error can't find the container with id 83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.947434 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" exitCode=0 Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.947532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e"} Mar 01 10:28:23 crc kubenswrapper[4792]: I0301 10:28:23.948022 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d"} Mar 01 10:28:24 crc kubenswrapper[4792]: I0301 10:28:24.958210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} Mar 01 10:28:25 crc kubenswrapper[4792]: I0301 10:28:25.409859 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:25 crc kubenswrapper[4792]: E0301 10:28:25.410363 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:26 crc kubenswrapper[4792]: I0301 10:28:26.978961 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" exitCode=0 Mar 01 10:28:26 crc kubenswrapper[4792]: I0301 10:28:26.980408 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} Mar 01 10:28:27 crc kubenswrapper[4792]: I0301 10:28:27.991445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerStarted","Data":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} Mar 01 10:28:28 crc kubenswrapper[4792]: I0301 10:28:28.019297 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4zrv" 
podStartSLOduration=2.564903192 podStartE2EDuration="6.019274688s" podCreationTimestamp="2026-03-01 10:28:22 +0000 UTC" firstStartedPulling="2026-03-01 10:28:23.949589209 +0000 UTC m=+4833.191468406" lastFinishedPulling="2026-03-01 10:28:27.403960715 +0000 UTC m=+4836.645839902" observedRunningTime="2026-03-01 10:28:28.010535859 +0000 UTC m=+4837.252415056" watchObservedRunningTime="2026-03-01 10:28:28.019274688 +0000 UTC m=+4837.261153885" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.845941 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.846509 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:32 crc kubenswrapper[4792]: I0301 10:28:32.899932 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:33 crc kubenswrapper[4792]: I0301 10:28:33.075079 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:33 crc kubenswrapper[4792]: I0301 10:28:33.136623 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.041688 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4zrv" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" containerID="cri-o://059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" gracePeriod=2 Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.488165 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617286 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.617413 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") pod \"7a74a274-b6f7-421d-917b-33034be46bcf\" (UID: \"7a74a274-b6f7-421d-917b-33034be46bcf\") " Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.618248 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities" (OuterVolumeSpecName: "utilities") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.623207 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95" (OuterVolumeSpecName: "kube-api-access-v4b95") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "kube-api-access-v4b95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.666958 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a74a274-b6f7-421d-917b-33034be46bcf" (UID: "7a74a274-b6f7-421d-917b-33034be46bcf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719559 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719598 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a74a274-b6f7-421d-917b-33034be46bcf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:35 crc kubenswrapper[4792]: I0301 10:28:35.719616 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4b95\" (UniqueName: \"kubernetes.io/projected/7a74a274-b6f7-421d-917b-33034be46bcf-kube-api-access-v4b95\") on node \"crc\" DevicePath \"\"" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051377 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a74a274-b6f7-421d-917b-33034be46bcf" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" exitCode=0 Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051438 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4zrv" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051498 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4zrv" event={"ID":"7a74a274-b6f7-421d-917b-33034be46bcf","Type":"ContainerDied","Data":"83306a83f874a192e94ea93f5f35eeb60edae0342b45325c1cc5caf61b810c5d"} Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.051517 4792 scope.go:117] "RemoveContainer" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.089943 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.103423 4792 scope.go:117] "RemoveContainer" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.107331 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k4zrv"] Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.133273 4792 scope.go:117] "RemoveContainer" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173217 4792 scope.go:117] "RemoveContainer" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.173709 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": container with ID starting with 059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce not found: ID does not exist" containerID="059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173755 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce"} err="failed to get container status \"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": rpc error: code = NotFound desc = could not find container \"059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce\": container with ID starting with 059338cee094fc4765fb7b2214647dfeeb6a3603d7b01b2934562f3d164bf7ce not found: ID does not exist" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.173789 4792 scope.go:117] "RemoveContainer" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.174188 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": container with ID starting with 015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66 not found: ID does not exist" containerID="015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174231 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66"} err="failed to get container status \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": rpc error: code = NotFound desc = could not find container \"015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66\": container with ID 
starting with 015d6fed1253f9fed0bc4b9aaca2607218259b4b5bcf7ec7fd11af9d2009ea66 not found: ID does not exist" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174263 4792 scope.go:117] "RemoveContainer" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: E0301 10:28:36.174668 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": container with ID starting with 3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e not found: ID does not exist" containerID="3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e" Mar 01 10:28:36 crc kubenswrapper[4792]: I0301 10:28:36.174716 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e"} err="failed to get container status \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": rpc error: code = NotFound desc = could not find container \"3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e\": container with ID starting with 3556e01ca51d89d9810182ac44c5e47267a5a99cda21c1fe6ba8f3a99f10147e not found: ID does not exist" Mar 01 10:28:37 crc kubenswrapper[4792]: I0301 10:28:37.409172 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:37 crc kubenswrapper[4792]: E0301 10:28:37.409712 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:37 crc kubenswrapper[4792]: I0301 10:28:37.419038 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" path="/var/lib/kubelet/pods/7a74a274-b6f7-421d-917b-33034be46bcf/volumes" Mar 01 10:28:40 crc kubenswrapper[4792]: I0301 10:28:40.303965 4792 scope.go:117] "RemoveContainer" containerID="ec9227dfee7817fbe5632f2b037665aa0557b7bf6988989846a2502b4704a463" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.685788 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.916015 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.940887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f86869f48-jg6nw_6c97472a-b6b7-4fc4-b872-a318812f0999/barbican-api-log/0.log" Mar 01 10:28:49 crc kubenswrapper[4792]: I0301 10:28:49.986730 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64f98fd86b-96l6n_d30c642c-b4ae-495a-8acd-cc8be4a0f412/barbican-keystone-listener-log/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.250577 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.278511 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f4d58895-tvn59_26fbd30a-a485-4463-9aac-bb695c43e9e3/barbican-worker-log/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.538621 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nscjb_1201ca91-41eb-45d0-991d-71883b4014ae/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.668423 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-notification-agent/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.682494 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/ceilometer-central-agent/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.747168 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/proxy-httpd/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.873499 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_63238274-bc2e-4686-8371-e891944269f9/sg-core/0.log" Mar 01 10:28:50 crc kubenswrapper[4792]: I0301 10:28:50.958082 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-wpgqj_f3a428e9-b35d-4f80-bb40-c158095d5bfa/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.159076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-m485m_2c8e3a3c-dcb7-44da-9f38-1a3d6a9b5d37/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.295210 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.306189 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_084f9db1-15eb-458c-8b43-aeb5dbb0555f/cinder-api-log/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.649076 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/probe/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.742223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_23d15722-3d0f-44ce-ac55-eba67760f0e9/cinder-backup/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.829336 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/cinder-scheduler/0.log" Mar 01 10:28:51 crc kubenswrapper[4792]: I0301 10:28:51.999410 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_688f590f-ae5c-4caf-b8c7-013a118f42c5/probe/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.120244 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/cinder-volume/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.140592 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d3ca4743-fa6c-4e2e-b2c8-b2362f44a727/probe/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.408972 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:28:52 crc kubenswrapper[4792]: E0301 10:28:52.409199 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" 
podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.567655 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dtgks_f25228f4-912f-408c-a1d6-9279c350b767/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.772705 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ldrr5_cd1ec6ee-7caa-4ec6-91a8-dab399f5d0d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:52 crc kubenswrapper[4792]: I0301 10:28:52.886700 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.056556 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/init/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.200830 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-httpd/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.266778 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-fqgkv_49541358-1fd0-4d1d-8b61-0c618994dfc0/dnsmasq-dns/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.292109 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_52b189da-3327-40c1-bf22-a842b0980593/glance-log/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.492505 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-httpd/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 
10:28:53.508261 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9d055103-6c35-481f-820a-7aa363543404/glance-log/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.909461 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon/0.log" Mar 01 10:28:53 crc kubenswrapper[4792]: I0301 10:28:53.985893 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-xqprh_d11c64e6-0562-41d9-a213-f1c5749b4c83/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.025640 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-79f8cb6d9d-xg7h5_d7f79f77-ac1b-445e-8e28-85c8964f5461/horizon-log/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.235496 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4rw28_822af429-9091-43e5-a16d-7a287f2c5bb2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.417264 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749f685d77-ggsln_b60e7776-3e2a-4e08-900d-cd39a29a78bc/keystone-api/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.573963 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29539321-sclgm_7ec04609-b280-4df0-a0c5-2e4c7208c1c6/keystone-cron/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.736142 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6f21d62f-3539-4d5d-aeaa-cc816a51d412/kube-state-metrics/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.857809 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9cpgt_c7230f65-7e9a-4455-8d25-c49393bfbafe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:54 crc kubenswrapper[4792]: I0301 10:28:54.942644 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api-log/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.100040 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_e6660fdc-5636-44ec-b6c0-e0e417d72e8a/manila-api/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.269209 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/probe/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.275189 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_5813cf9a-1d9e-4a74-82e1-68e994c9175a/manila-scheduler/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.433443 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/manila-share/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.449503 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_03462f2f-874f-496a-934b-9fa6e2c55850/probe/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.835526 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-httpd/0.log" Mar 01 10:28:55 crc kubenswrapper[4792]: I0301 10:28:55.850639 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8bdfb955-kjg92_ceced30a-39e5-413f-a498-e5d4500f1eea/neutron-api/0.log" Mar 01 10:28:56 crc kubenswrapper[4792]: I0301 10:28:56.473436 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jlszt_f737af00-5e6f-4a95-bf94-738b72990ebd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:56 crc kubenswrapper[4792]: I0301 10:28:56.993750 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-log/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.077646 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f95aafcd-79b6-4ece-b3e1-ee9ea32a2754/nova-cell0-conductor-conductor/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.461894 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9c6de822-b7f5-4530-bb5b-ca879ff899fc/nova-api-api/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.485213 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9ef6cc4e-2fd6-403b-a163-638395c30672/nova-cell1-conductor-conductor/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.579290 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_63afaac7-c934-4410-b2b5-ab04ad085489/nova-cell1-novncproxy-novncproxy/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.799148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-cspq7_d7776778-c586-4ab6-8fdf-bfed4168992d/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:28:57 crc kubenswrapper[4792]: I0301 10:28:57.968546 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-log/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.322247 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_3a38c1a1-88bc-4bce-aea4-13e676aab111/nova-scheduler-scheduler/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.430335 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.692478 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/mysql-bootstrap/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.711112 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f2d03d42-7830-444b-a8ae-c91e16d352b9/galera/0.log" Mar 01 10:28:58 crc kubenswrapper[4792]: I0301 10:28:58.947398 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.271484 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/galera/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.338380 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b969e6eb-14a7-4e45-8342-ccbd05c06261/mysql-bootstrap/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.606342 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fecafda6-dcf9-46ea-8678-8da499154ad7/openstackclient/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.684387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cbf9560f-212f-460a-9a4d-250e20b00d18/nova-metadata-metadata/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.749417 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-7wc55_9493aff0-58e3-44ca-ba01-69f3b284d732/openstack-network-exporter/0.log" Mar 01 10:28:59 crc kubenswrapper[4792]: I0301 10:28:59.970764 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-mpvqc_d50ee3b1-4f97-4644-802d-04c85d9c3abc/ovn-controller/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.617123 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.848777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server-init/0.log" Mar 01 10:29:00 crc kubenswrapper[4792]: I0301 10:29:00.889887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovs-vswitchd/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.032451 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-nfzrr_22d78adc-2ff6-4f03-b60e-ac8e9a0f3699/ovsdb-server/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.286576 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bc5rl_e4b8a64b-6bea-426c-b1f5-2372342d4211/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.301056 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.393510 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1712e112-23fd-402b-ae0b-f63a594d4fab/ovn-northd/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.587693 4792 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.681957 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2c9312b5-705e-42f0-8462-62c8fdeb0791/ovsdbserver-nb/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.853170 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/openstack-network-exporter/0.log" Mar 01 10:29:01 crc kubenswrapper[4792]: I0301 10:29:01.942179 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a20f7417-3c04-411a-88b9-d60664faaee3/ovsdbserver-sb/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.118687 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-api/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.202554 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-84f9696594-qdwsv_18f9e703-dec0-46e1-a428-580bdb68e54e/placement-log/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.331989 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.591706 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.601114 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/setup-container/0.log" Mar 01 10:29:02 crc kubenswrapper[4792]: I0301 10:29:02.636522 4792 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e0e1dd7a-6a53-446d-bf90-5813f7a3fda0/rabbitmq/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.345477 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/setup-container/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.394673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqh9d_34275228-a1ab-4955-9d16-d184643a86d1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.426863 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_63658b27-63d9-4a0f-afca-3a3c245b9b9d/rabbitmq/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.747734 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9gc64_6c517000-6918-4f58-871b-7c4d26197ccf/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:03 crc kubenswrapper[4792]: I0301 10:29:03.793360 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gxxr7_ff733b23-0a97-4623-9eeb-339aa02fc3b0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.010185 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8k5rj_ac58ff00-ba74-492a-97f1-e72c56686f1d/ssh-known-hosts-edpm-deployment/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.451525 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee1c75ce-61f7-4ce5-a757-b7405d7135bd/tempest-tests-tempest-tests-runner/0.log" Mar 01 10:29:04 crc kubenswrapper[4792]: I0301 10:29:04.617725 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_478d8531-4e8e-4775-999d-42af4afef106/test-operator-logs-container/0.log" Mar 01 10:29:05 crc kubenswrapper[4792]: I0301 10:29:05.264420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-phn2l_59b987d8-9463-48cb-9651-1e5cb16aa764/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 01 10:29:07 crc kubenswrapper[4792]: I0301 10:29:07.408866 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:07 crc kubenswrapper[4792]: E0301 10:29:07.410096 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:16 crc kubenswrapper[4792]: I0301 10:29:16.914544 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_84d455ad-7bbb-4771-a8ed-9aa1984e1d40/memcached/0.log" Mar 01 10:29:22 crc kubenswrapper[4792]: I0301 10:29:22.409389 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:22 crc kubenswrapper[4792]: E0301 10:29:22.410193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 
01 10:29:37 crc kubenswrapper[4792]: I0301 10:29:37.409384 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:37 crc kubenswrapper[4792]: E0301 10:29:37.410336 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.155310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.397288 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.443485 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.456301 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.596420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/pull/0.log" Mar 01 10:29:38 crc 
kubenswrapper[4792]: I0301 10:29:38.642153 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/util/0.log" Mar 01 10:29:38 crc kubenswrapper[4792]: I0301 10:29:38.682552 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_905e9a425c33e47d6204269039a1e357ac8e7863c72f041cbfc2c79677b6jl2_d4447fd9-d2df-47f4-a94f-ff8b4c5080bd/extract/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.355720 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-72srw_bf1f37ea-a566-4dfd-b45b-02f284f19ce3/manager/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.678871 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-9wzbh_02dd5cc0-c44b-4ede-972b-9d26c9c54100/manager/0.log" Mar 01 10:29:39 crc kubenswrapper[4792]: I0301 10:29:39.958084 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-7v65r_5044cf86-f557-41d4-b6c0-a41a668ac999/manager/0.log" Mar 01 10:29:40 crc kubenswrapper[4792]: I0301 10:29:40.277785 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-55qzx_cd83ed19-023d-43c2-92db-d290499db3d4/manager/0.log" Mar 01 10:29:40 crc kubenswrapper[4792]: I0301 10:29:40.880570 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-jvw5j_2af4993f-9ba9-4f7a-a31e-2bd133a7d4c5/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.160325 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-dsqtf_ea6739c2-185a-43e7-8fcf-0b2ae31957a0/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.339109 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jlnsb_8741a141-0194-4eb2-956e-c41f4ffe1338/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.553225 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-wjf62_234d2ae5-7589-44cc-83f4-b0ee8a91940a/manager/0.log" Mar 01 10:29:41 crc kubenswrapper[4792]: I0301 10:29:41.660010 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-t5fsn_376afe52-646d-44b7-b32e-ce6cd6dc21a6/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.495175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-hlzm6_1793465e-1273-4250-a238-c99798788618/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.526404 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-qjqd2_dfb10d33-c4f1-4287-be83-dff835c733ba/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.888898 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-knk7m_8307ba19-5fc4-4cfc-b3cd-cafe5eac9cb9/manager/0.log" Mar 01 10:29:42 crc kubenswrapper[4792]: I0301 10:29:42.906122 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-54rpl_ecc17c18-7695-4d22-9a95-bcac51800d60/manager/0.log" Mar 01 10:29:43 crc kubenswrapper[4792]: I0301 10:29:43.635336 4792 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b4cc4776948grv_9244686e-175e-45f9-9eb7-23621cd1f3cd/manager/0.log" Mar 01 10:29:43 crc kubenswrapper[4792]: I0301 10:29:43.928597 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-595c94944c-vtchh_c967e6f5-6388-4ae5-9ccf-379b6305e1b0/operator/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.298343 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5kfk4_dc22117a-72a7-4838-bb1c-111e91514b98/registry-server/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.375201 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-zkx7c_3d38195c-e4ff-49cf-9592-e9f52d73f2df/manager/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.556746 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-jdn6k_808b8753-0a20-419b-8b04-dcbccaa2d77e/manager/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.700555 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5l9m_1ecd6b07-eda9-41d6-90af-6471699ff808/operator/0.log" Mar 01 10:29:44 crc kubenswrapper[4792]: I0301 10:29:44.958496 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-mqndr_e0cef8e2-a392-4612-97c6-17c611b2a44e/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.230279 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-jpxwz_4fe8270e-a46d-40bc-8d24-a4585b196f5e/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.300765 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-bcnns_2970c60c-7b03-4667-99e4-08c094cdbfc2/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.503252 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-64lkf_e45ebab9-87d5-4b2f-b3d1-f1832864584d/manager/0.log" Mar 01 10:29:45 crc kubenswrapper[4792]: I0301 10:29:45.952727 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-864b865b94-5ndlx_d1d3783f-78e9-461a-916a-5a46e3083e70/manager/0.log" Mar 01 10:29:49 crc kubenswrapper[4792]: I0301 10:29:49.411316 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:29:49 crc kubenswrapper[4792]: E0301 10:29:49.411832 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:29:50 crc kubenswrapper[4792]: I0301 10:29:50.694771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-ggspg_b9e3fd6b-e3e2-4380-b8d7-900891df562a/manager/0.log" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.154843 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155590 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-content" Mar 01 10:30:00 crc 
kubenswrapper[4792]: I0301 10:30:00.155620 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-content" Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155638 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-utilities" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155644 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="extract-utilities" Mar 01 10:30:00 crc kubenswrapper[4792]: E0301 10:30:00.155661 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155668 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.155859 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a74a274-b6f7-421d-917b-33034be46bcf" containerName="registry-server" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.156665 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.159682 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.159726 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.160343 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.178545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.179702 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.183402 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.183774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.195660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.212868 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"] Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.293722 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294252 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.294415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 
10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395900 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.395978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.396050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.397514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.563713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"collect-profiles-29539350-hqxcz\" (UID: 
\"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.563831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"collect-profiles-29539350-hqxcz\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.564045 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"auto-csr-approver-29539350-vrwj8\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.785208 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:00 crc kubenswrapper[4792]: I0301 10:30:00.804654 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.356992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.368823 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz"] Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.416261 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:30:01 crc kubenswrapper[4792]: E0301 10:30:01.416807 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.924736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerStarted","Data":"10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e"} Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.936067 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerStarted","Data":"f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f"} Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.936120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerStarted","Data":"f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe"} Mar 01 10:30:01 crc kubenswrapper[4792]: I0301 10:30:01.961254 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" podStartSLOduration=1.9612374940000001 podStartE2EDuration="1.961237494s" podCreationTimestamp="2026-03-01 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-01 10:30:01.959025329 +0000 UTC m=+4931.200904546" watchObservedRunningTime="2026-03-01 10:30:01.961237494 +0000 UTC m=+4931.203116691" Mar 01 10:30:02 crc kubenswrapper[4792]: E0301 10:30:02.242164 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf24ca39_d196_4cdd_8521_da51a4f51649.slice/crio-conmon-f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf24ca39_d196_4cdd_8521_da51a4f51649.slice/crio-f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f.scope\": RecentStats: unable to find data in memory cache]" Mar 01 10:30:02 crc kubenswrapper[4792]: I0301 10:30:02.945402 4792 generic.go:334] "Generic (PLEG): container finished" podID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerID="f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f" exitCode=0 Mar 01 10:30:02 crc kubenswrapper[4792]: I0301 10:30:02.945496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" 
event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerDied","Data":"f553d8ce8b9a885e297acd39ab9caadfb2d86a33b2a8d2eb99d093eaf06f4e9f"} Mar 01 10:30:03 crc kubenswrapper[4792]: I0301 10:30:03.954040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerStarted","Data":"72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9"} Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.326441 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.345384 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" podStartSLOduration=2.265024954 podStartE2EDuration="4.345364456s" podCreationTimestamp="2026-03-01 10:30:00 +0000 UTC" firstStartedPulling="2026-03-01 10:30:01.384482646 +0000 UTC m=+4930.626361843" lastFinishedPulling="2026-03-01 10:30:03.464822148 +0000 UTC m=+4932.706701345" observedRunningTime="2026-03-01 10:30:03.970934389 +0000 UTC m=+4933.212813576" watchObservedRunningTime="2026-03-01 10:30:04.345364456 +0000 UTC m=+4933.587243653" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" 
(UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.385720 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") pod \"cf24ca39-d196-4cdd-8521-da51a4f51649\" (UID: \"cf24ca39-d196-4cdd-8521-da51a4f51649\") " Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.390097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.398835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.420127 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb" (OuterVolumeSpecName: "kube-api-access-gf4pb") pod "cf24ca39-d196-4cdd-8521-da51a4f51649" (UID: "cf24ca39-d196-4cdd-8521-da51a4f51649"). InnerVolumeSpecName "kube-api-access-gf4pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.442580 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.455324 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29539305-ml8xz"] Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.488182 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf4pb\" (UniqueName: \"kubernetes.io/projected/cf24ca39-d196-4cdd-8521-da51a4f51649-kube-api-access-gf4pb\") on node \"crc\" DevicePath \"\"" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.489828 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf24ca39-d196-4cdd-8521-da51a4f51649-config-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.489939 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf24ca39-d196-4cdd-8521-da51a4f51649-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.966856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" event={"ID":"cf24ca39-d196-4cdd-8521-da51a4f51649","Type":"ContainerDied","Data":"f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe"} Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.966896 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f62065739fd8378d875f429e28e19ba02569d3da54f6ca4c619f6d141a3465fe" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.968127 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29539350-hqxcz" Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.970319 4792 generic.go:334] "Generic (PLEG): container finished" podID="f461ce0a-d106-4086-a698-987b95f5f03e" containerID="72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9" exitCode=0 Mar 01 10:30:04 crc kubenswrapper[4792]: I0301 10:30:04.970367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerDied","Data":"72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9"} Mar 01 10:30:05 crc kubenswrapper[4792]: I0301 10:30:05.421176 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a9ad8e-1c99-4a79-87eb-912aab1dc48c" path="/var/lib/kubelet/pods/e7a9ad8e-1c99-4a79-87eb-912aab1dc48c/volumes" Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.312383 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.432694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") pod \"f461ce0a-d106-4086-a698-987b95f5f03e\" (UID: \"f461ce0a-d106-4086-a698-987b95f5f03e\") " Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.439343 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4" (OuterVolumeSpecName: "kube-api-access-4fdq4") pod "f461ce0a-d106-4086-a698-987b95f5f03e" (UID: "f461ce0a-d106-4086-a698-987b95f5f03e"). InnerVolumeSpecName "kube-api-access-4fdq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:30:06 crc kubenswrapper[4792]: I0301 10:30:06.535219 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fdq4\" (UniqueName: \"kubernetes.io/projected/f461ce0a-d106-4086-a698-987b95f5f03e-kube-api-access-4fdq4\") on node \"crc\" DevicePath \"\"" Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.009812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" event={"ID":"f461ce0a-d106-4086-a698-987b95f5f03e","Type":"ContainerDied","Data":"10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e"} Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.010064 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ba251a37312235f0dee9803ff2e110d026416ee1d83aa9693bad44cb7d068e" Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.010118 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539350-vrwj8" Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.033340 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.062387 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539344-r7v49"] Mar 01 10:30:07 crc kubenswrapper[4792]: I0301 10:30:07.419122 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5eb940-780b-4b4d-ab60-e1ad0c284811" path="/var/lib/kubelet/pods/6c5eb940-780b-4b4d-ab60-e1ad0c284811/volumes" Mar 01 10:30:10 crc kubenswrapper[4792]: I0301 10:30:10.640175 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9smfd_e0b63d94-59de-45da-8058-89714bea7a90/control-plane-machine-set-operator/0.log" Mar 01 10:30:10 crc kubenswrapper[4792]: 
I0301 10:30:10.757048 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/kube-rbac-proxy/0.log" Mar 01 10:30:10 crc kubenswrapper[4792]: I0301 10:30:10.824149 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nv4bp_51683a24-edad-4808-b2ec-6a628bfdd937/machine-api-operator/0.log" Mar 01 10:30:16 crc kubenswrapper[4792]: I0301 10:30:16.410590 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:30:16 crc kubenswrapper[4792]: E0301 10:30:16.411444 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.056243 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4qgsm_bf71ada0-c7b2-4255-bb2c-31ec3309a29d/cert-manager-controller/0.log" Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.152368 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tm5s6_2071887a-31a9-428d-92d0-bf8a361011ca/cert-manager-cainjector/0.log" Mar 01 10:30:23 crc kubenswrapper[4792]: I0301 10:30:23.235111 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rckpb_a03eedd4-ecde-4905-95a7-c43b45ef9da9/cert-manager-webhook/0.log" Mar 01 10:30:27 crc kubenswrapper[4792]: I0301 10:30:27.409587 4792 scope.go:117] "RemoveContainer" 
containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:30:27 crc kubenswrapper[4792]: E0301 10:30:27.410875 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bqszv_openshift-machine-config-operator(9105f6b0-6f16-47aa-8009-73736a90b765)\"" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" Mar 01 10:30:35 crc kubenswrapper[4792]: I0301 10:30:35.813070 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mtxkm_f7ca92c8-f38b-4a0a-b330-5809993cbb49/nmstate-console-plugin/0.log" Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.033732 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/nmstate-metrics/0.log" Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.041756 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9j2tz_7105919f-ddac-45db-a8f7-bd927e5737df/nmstate-handler/0.log" Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.065183 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-97cv9_bfe2cc56-28ca-4201-ba5a-4208dd1ec818/kube-rbac-proxy/0.log" Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.350187 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-chfpw_fb942d1c-2a1a-4265-ae29-02f185d4cc40/nmstate-operator/0.log" Mar 01 10:30:36 crc kubenswrapper[4792]: I0301 10:30:36.416277 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-zwhpc_aa2300d6-10c0-4dc9-812a-fcb30f09920e/nmstate-webhook/0.log" Mar 01 10:30:40 crc kubenswrapper[4792]: I0301 10:30:40.451569 4792 scope.go:117] "RemoveContainer" containerID="1f71a688006db007bbde2ad2f2afb131f6dc80a772403871242301338cd9bc3e" Mar 01 10:30:40 crc kubenswrapper[4792]: I0301 10:30:40.516328 4792 scope.go:117] "RemoveContainer" containerID="5aa67c39154d74ad3f75b4616f3e6439b947759682dc6085d9ce37f8cd99894c" Mar 01 10:30:42 crc kubenswrapper[4792]: I0301 10:30:42.409135 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:30:43 crc kubenswrapper[4792]: I0301 10:30:43.295772 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"} Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.054503 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/kube-rbac-proxy/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.138773 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-twxml_f73a6813-31ea-4018-bd23-45bf2f1dfe89/controller/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.259866 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.411468 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.439172 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.473843 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.507314 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.690336 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.717232 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.722888 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.746794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.912162 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-frr-files/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.916354 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-reloader/0.log" Mar 01 10:31:09 crc kubenswrapper[4792]: I0301 10:31:09.932692 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/cp-metrics/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:09.998795 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/controller/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.102582 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr-metrics/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.190660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.253671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/kube-rbac-proxy-frr/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.287047 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/reloader/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.602348 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-kfnzk_d2f0572c-e661-495c-873c-6e2d18f2ab7d/frr-k8s-webhook-server/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.795736 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cd84fcfbc-lrpmz_ba22e25a-31e8-4ca7-b169-f7433eda818b/manager/0.log" Mar 01 10:31:10 crc kubenswrapper[4792]: I0301 10:31:10.916064 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-776c7d78bd-jwfh6_cf86866e-8afa-44da-a688-e1c018a025bd/webhook-server/0.log" Mar 01 10:31:11 crc kubenswrapper[4792]: I0301 10:31:11.269844 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/kube-rbac-proxy/0.log" Mar 01 10:31:11 crc kubenswrapper[4792]: I0301 10:31:11.727783 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zpr27_8a9cb5a8-b4fb-4b1f-8bf3-68b1d01743f7/speaker/0.log" Mar 01 10:31:12 crc kubenswrapper[4792]: I0301 10:31:12.003735 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjh95_53127911-b831-4b3a-816d-ff8271118244/frr/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.579633 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.699959 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.769272 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.804633 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.952827 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/extract/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.979239 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/pull/0.log" Mar 01 10:31:25 crc kubenswrapper[4792]: I0301 10:31:25.992794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82v4vst_8b2fbe4e-4a71-4ce1-b7cc-3063b89d65bd/util/0.log" Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.857069 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.960809 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 01 10:31:26 crc kubenswrapper[4792]: I0301 10:31:26.980461 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.031361 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.256845 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-utilities/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.262640 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/extract-content/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.474483 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.754116 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.790452 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sksw8_55c448d6-e926-4b07-8aec-8195d42d2e30/registry-server/0.log" Mar 01 10:31:27 crc kubenswrapper[4792]: I0301 10:31:27.831245 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.127291 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.314980 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-content/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.364663 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/extract-utilities/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.646083 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.912709 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.927448 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-48zdf_d875f1af-e90b-4882-b472-f91651d468a6/registry-server/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.978049 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:31:28 crc kubenswrapper[4792]: I0301 10:31:28.996896 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.100711 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/util/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.134520 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/pull/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.175974 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46b25t_a7c3d28a-4f36-4a3c-a4f6-793a5f945cd4/extract/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.309755 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gfkbs_46fe59e7-8122-4621-ae8d-237a91daee5e/marketplace-operator/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: 
I0301 10:31:29.384456 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.554706 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.556126 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.620231 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.841150 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-content/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.849080 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/extract-utilities/0.log" Mar 01 10:31:29 crc kubenswrapper[4792]: I0301 10:31:29.943148 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zjvjw_3003e690-c3dd-4236-a95c-a0fb6ccb438e/registry-server/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.036425 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.183476 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.200838 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.237736 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.382163 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-utilities/0.log" Mar 01 10:31:30 crc kubenswrapper[4792]: I0301 10:31:30.382671 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/extract-content/0.log" Mar 01 10:31:31 crc kubenswrapper[4792]: I0301 10:31:31.017840 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7fdqh_38bc0c09-286e-427a-95c2-8e2c9213b142/registry-server/0.log" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.142260 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:32:00 crc kubenswrapper[4792]: E0301 10:32:00.142975 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.142988 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc" Mar 01 10:32:00 crc kubenswrapper[4792]: E0301 10:32:00.142997 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143003 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143187 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" containerName="oc" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143208 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf24ca39-d196-4cdd-8521-da51a4f51649" containerName="collect-profiles" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.143803 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149115 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149217 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.149379 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.176349 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.242024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm" 
Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.343838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.363685 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"auto-csr-approver-29539352-dwpdm\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:00 crc kubenswrapper[4792]: I0301 10:32:00.493543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:01 crc kubenswrapper[4792]: I0301 10:32:01.088218 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539352-dwpdm"] Mar 01 10:32:01 crc kubenswrapper[4792]: I0301 10:32:01.106542 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 01 10:32:02 crc kubenswrapper[4792]: I0301 10:32:02.022928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerStarted","Data":"34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7"} Mar 01 10:32:03 crc kubenswrapper[4792]: I0301 10:32:03.032369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" 
event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerStarted","Data":"59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722"} Mar 01 10:32:04 crc kubenswrapper[4792]: I0301 10:32:04.041395 4792 generic.go:334] "Generic (PLEG): container finished" podID="a3569407-4c99-405b-801c-6b0378e1643b" containerID="59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722" exitCode=0 Mar 01 10:32:04 crc kubenswrapper[4792]: I0301 10:32:04.041436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerDied","Data":"59031c13708156c9066817a64f64803e8bd7d915e063b5c1f1224fd3d37f0722"} Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.465003 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.538113 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") pod \"a3569407-4c99-405b-801c-6b0378e1643b\" (UID: \"a3569407-4c99-405b-801c-6b0378e1643b\") " Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.561177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll" (OuterVolumeSpecName: "kube-api-access-s45ll") pod "a3569407-4c99-405b-801c-6b0378e1643b" (UID: "a3569407-4c99-405b-801c-6b0378e1643b"). InnerVolumeSpecName "kube-api-access-s45ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:32:05 crc kubenswrapper[4792]: I0301 10:32:05.640356 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s45ll\" (UniqueName: \"kubernetes.io/projected/a3569407-4c99-405b-801c-6b0378e1643b-kube-api-access-s45ll\") on node \"crc\" DevicePath \"\"" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" event={"ID":"a3569407-4c99-405b-801c-6b0378e1643b","Type":"ContainerDied","Data":"34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7"} Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064779 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c7bb3f42e4933f2ed76f82d2443b6df65a41268d1ac12d3d1128bf2f7ce4c7" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.064557 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539352-dwpdm" Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.110241 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:32:06 crc kubenswrapper[4792]: I0301 10:32:06.123220 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539346-l8ktk"] Mar 01 10:32:07 crc kubenswrapper[4792]: I0301 10:32:07.420306 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c23503b-d97f-4cef-b792-e7fbdd8934ab" path="/var/lib/kubelet/pods/9c23503b-d97f-4cef-b792-e7fbdd8934ab/volumes" Mar 01 10:32:40 crc kubenswrapper[4792]: I0301 10:32:40.613514 4792 scope.go:117] "RemoveContainer" containerID="f2c5b6ef4792c5289a47b67d38a5bcb076b8c29acc57acf629d74f960223e4cf" Mar 01 10:33:04 crc kubenswrapper[4792]: I0301 10:33:04.942630 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:33:04 crc kubenswrapper[4792]: I0301 10:33:04.943463 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:33:34 crc kubenswrapper[4792]: I0301 10:33:34.943455 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:33:34 crc kubenswrapper[4792]: I0301 10:33:34.944015 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:33:40 crc kubenswrapper[4792]: I0301 10:33:40.707028 4792 scope.go:117] "RemoveContainer" containerID="9b1ac8b2a6b9a9734637c133cb7e1fc37defae1380b1bd14a5fb31b1efa6e0e5" Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.203320 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3c98b67-7926-411d-9068-0b7991b0551b" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" exitCode=0 Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.203549 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gqc89/must-gather-vdffg" 
event={"ID":"f3c98b67-7926-411d-9068-0b7991b0551b","Type":"ContainerDied","Data":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.204578 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:33:56 crc kubenswrapper[4792]: I0301 10:33:56.542062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/gather/0.log" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.146778 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:00 crc kubenswrapper[4792]: E0301 10:34:00.147659 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.147671 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.147848 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3569407-4c99-405b-801c-6b0378e1643b" containerName="oc" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.148527 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.150419 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.152750 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.156091 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.168441 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.221410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.331268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.576712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"auto-csr-approver-29539354-xht7z\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " 
pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:00 crc kubenswrapper[4792]: I0301 10:34:00.767691 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:01 crc kubenswrapper[4792]: I0301 10:34:01.253205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539354-xht7z"] Mar 01 10:34:02 crc kubenswrapper[4792]: I0301 10:34:02.251924 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerStarted","Data":"7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a"} Mar 01 10:34:03 crc kubenswrapper[4792]: I0301 10:34:03.262883 4792 generic.go:334] "Generic (PLEG): container finished" podID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerID="18f7a1fbb5db0372beb7cb8583c564353109011b20fbac6efa4b77f8d451a33b" exitCode=0 Mar 01 10:34:03 crc kubenswrapper[4792]: I0301 10:34:03.263088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerDied","Data":"18f7a1fbb5db0372beb7cb8583c564353109011b20fbac6efa4b77f8d451a33b"} Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.694368 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.842986 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") pod \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\" (UID: \"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6\") " Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.849807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp" (OuterVolumeSpecName: "kube-api-access-b9ckp") pod "cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" (UID: "cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6"). InnerVolumeSpecName "kube-api-access-b9ckp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942589 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942641 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.942679 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.943413 4792 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"} pod="openshift-machine-config-operator/machine-config-daemon-bqszv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.943464 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" containerID="cri-o://7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543" gracePeriod=600 Mar 01 10:34:04 crc kubenswrapper[4792]: I0301 10:34:04.944728 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ckp\" (UniqueName: \"kubernetes.io/projected/cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6-kube-api-access-b9ckp\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539354-xht7z" event={"ID":"cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6","Type":"ContainerDied","Data":"7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302054 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f87b9ff5f5cebd3fa4f8cc04ab1fa49ab1105e996fc5ed9853ba70b1a8e1d9a" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.302122 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539354-xht7z" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306207 4792 generic.go:334] "Generic (PLEG): container finished" podID="9105f6b0-6f16-47aa-8009-73736a90b765" containerID="7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543" exitCode=0 Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306247 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerDied","Data":"7b8e6b0aba17b92d064bcafd390c97d2064d3be27e2b966c778b962700333543"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" event={"ID":"9105f6b0-6f16-47aa-8009-73736a90b765","Type":"ContainerStarted","Data":"73191cfccc209b70bde4b4703193d041d53f629db3567c2baa8fb03d0794d6ef"} Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.306289 4792 scope.go:117] "RemoveContainer" containerID="6c562d0736d2f1d47c96f057fbf023fb570ad43189229c38648e9554b553cb43" Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.765405 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:34:05 crc kubenswrapper[4792]: I0301 10:34:05.777240 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539348-5hvql"] Mar 01 10:34:07 crc kubenswrapper[4792]: I0301 10:34:07.422156 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5867157f-b16a-460c-afc4-0981a4d8ee43" path="/var/lib/kubelet/pods/5867157f-b16a-460c-afc4-0981a4d8ee43/volumes" Mar 01 10:34:11 crc kubenswrapper[4792]: I0301 10:34:11.626591 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:34:11 crc kubenswrapper[4792]: 
I0301 10:34:11.627326 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gqc89/must-gather-vdffg" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" containerID="cri-o://3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" gracePeriod=2 Mar 01 10:34:11 crc kubenswrapper[4792]: I0301 10:34:11.638925 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gqc89/must-gather-vdffg"] Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.060620 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/copy/0.log" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.060934 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.198919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") pod \"f3c98b67-7926-411d-9068-0b7991b0551b\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.199000 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") pod \"f3c98b67-7926-411d-9068-0b7991b0551b\" (UID: \"f3c98b67-7926-411d-9068-0b7991b0551b\") " Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.203914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x" (OuterVolumeSpecName: "kube-api-access-mcq2x") pod "f3c98b67-7926-411d-9068-0b7991b0551b" (UID: "f3c98b67-7926-411d-9068-0b7991b0551b"). 
InnerVolumeSpecName "kube-api-access-mcq2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.301031 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcq2x\" (UniqueName: \"kubernetes.io/projected/f3c98b67-7926-411d-9068-0b7991b0551b-kube-api-access-mcq2x\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.361817 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f3c98b67-7926-411d-9068-0b7991b0551b" (UID: "f3c98b67-7926-411d-9068-0b7991b0551b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.375727 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gqc89_must-gather-vdffg_f3c98b67-7926-411d-9068-0b7991b0551b/copy/0.log" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376021 4792 generic.go:334] "Generic (PLEG): container finished" podID="f3c98b67-7926-411d-9068-0b7991b0551b" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" exitCode=143 Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376073 4792 scope.go:117] "RemoveContainer" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.376198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gqc89/must-gather-vdffg" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.398557 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.402364 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3c98b67-7926-411d-9068-0b7991b0551b-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.457543 4792 scope.go:117] "RemoveContainer" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: E0301 10:34:12.457934 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": container with ID starting with 3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560 not found: ID does not exist" containerID="3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.457977 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560"} err="failed to get container status \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": rpc error: code = NotFound desc = could not find container \"3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560\": container with ID starting with 3a427244043269002de52e0a079a3760bf74195c9fc73cf6abf599f3980c7560 not found: ID does not exist" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.458005 4792 scope.go:117] "RemoveContainer" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: E0301 
10:34:12.458384 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": container with ID starting with 08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8 not found: ID does not exist" containerID="08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8" Mar 01 10:34:12 crc kubenswrapper[4792]: I0301 10:34:12.458425 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8"} err="failed to get container status \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": rpc error: code = NotFound desc = could not find container \"08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8\": container with ID starting with 08ddb8424f25ab955f23fb248fb3199c701786d1289df5f9635fe0b5ba6df5a8 not found: ID does not exist" Mar 01 10:34:13 crc kubenswrapper[4792]: I0301 10:34:13.428219 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" path="/var/lib/kubelet/pods/f3c98b67-7926-411d-9068-0b7991b0551b/volumes" Mar 01 10:34:40 crc kubenswrapper[4792]: I0301 10:34:40.921791 4792 scope.go:117] "RemoveContainer" containerID="9237ca7f55124284e9b80295a7be3e3ee8987057df25870f237eb05840050933" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.113202 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.118623 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.118884 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 
10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.119010 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119094 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: E0301 10:35:55.119189 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119268 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119571 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="gather" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119678 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c98b67-7926-411d-9068-0b7991b0551b" containerName="copy" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.119783 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6c842e-8a6b-4db8-b8b2-af0e9e9d64c6" containerName="oc" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.121529 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.130571 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.147948 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.148186 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.148231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249666 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.249734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.250173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.250293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.271799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"redhat-marketplace-q26sh\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.447430 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:35:55 crc kubenswrapper[4792]: I0301 10:35:55.972719 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.390796 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" exitCode=0 Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.390954 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659"} Mar 01 10:35:56 crc kubenswrapper[4792]: I0301 10:35:56.391376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"5edeba1a0067c6ac8c2c982f4caebc08f5998e30ac052774d0d5008a752a8a8f"} Mar 01 10:35:57 crc kubenswrapper[4792]: I0301 10:35:57.405463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} Mar 01 10:35:58 crc kubenswrapper[4792]: I0301 10:35:58.420832 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" exitCode=0 Mar 01 10:35:58 crc kubenswrapper[4792]: I0301 10:35:58.421039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" 
event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} Mar 01 10:35:59 crc kubenswrapper[4792]: I0301 10:35:59.433876 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerStarted","Data":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} Mar 01 10:35:59 crc kubenswrapper[4792]: I0301 10:35:59.458613 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q26sh" podStartSLOduration=2.04419399 podStartE2EDuration="4.458589849s" podCreationTimestamp="2026-03-01 10:35:55 +0000 UTC" firstStartedPulling="2026-03-01 10:35:56.39302662 +0000 UTC m=+5285.634905817" lastFinishedPulling="2026-03-01 10:35:58.807422489 +0000 UTC m=+5288.049301676" observedRunningTime="2026-03-01 10:35:59.45262213 +0000 UTC m=+5288.694501327" watchObservedRunningTime="2026-03-01 10:35:59.458589849 +0000 UTC m=+5288.700469046" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.167610 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.169713 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.172496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k8xpz" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.175363 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.175507 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.188656 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.279219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.381670 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.401741 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"auto-csr-approver-29539356-2s5vx\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " 
pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.497075 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:00 crc kubenswrapper[4792]: I0301 10:36:00.993486 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29539356-2s5vx"] Mar 01 10:36:01 crc kubenswrapper[4792]: I0301 10:36:01.458118 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerStarted","Data":"bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8"} Mar 01 10:36:02 crc kubenswrapper[4792]: I0301 10:36:02.469895 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerStarted","Data":"44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b"} Mar 01 10:36:02 crc kubenswrapper[4792]: I0301 10:36:02.489639 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" podStartSLOduration=1.552427459 podStartE2EDuration="2.489612314s" podCreationTimestamp="2026-03-01 10:36:00 +0000 UTC" firstStartedPulling="2026-03-01 10:36:01.005711002 +0000 UTC m=+5290.247590199" lastFinishedPulling="2026-03-01 10:36:01.942895857 +0000 UTC m=+5291.184775054" observedRunningTime="2026-03-01 10:36:02.485498221 +0000 UTC m=+5291.727377418" watchObservedRunningTime="2026-03-01 10:36:02.489612314 +0000 UTC m=+5291.731491511" Mar 01 10:36:03 crc kubenswrapper[4792]: I0301 10:36:03.479144 4792 generic.go:334] "Generic (PLEG): container finished" podID="a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" containerID="44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b" exitCode=0 Mar 01 10:36:03 crc 
kubenswrapper[4792]: I0301 10:36:03.479201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerDied","Data":"44e705e0a477a0d8f3cbeb58cb0807914e2a2315e9879ce644f415ed2c08652b"} Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.847191 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.912759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") pod \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\" (UID: \"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4\") " Mar 01 10:36:04 crc kubenswrapper[4792]: I0301 10:36:04.922887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf" (OuterVolumeSpecName: "kube-api-access-wvjcf") pod "a9cf147c-d70d-4595-bc95-e97ed6d5e6e4" (UID: "a9cf147c-d70d-4595-bc95-e97ed6d5e6e4"). InnerVolumeSpecName "kube-api-access-wvjcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.015501 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjcf\" (UniqueName: \"kubernetes.io/projected/a9cf147c-d70d-4595-bc95-e97ed6d5e6e4-kube-api-access-wvjcf\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.448254 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.450197 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.498696 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.499058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29539356-2s5vx" event={"ID":"a9cf147c-d70d-4595-bc95-e97ed6d5e6e4","Type":"ContainerDied","Data":"bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8"} Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.499172 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1c3b0b371c7e7bdc178d978e2d171b704f2ab3100c1cf4de3a74a7575130a8" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.537071 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.581604 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 10:36:05 crc kubenswrapper[4792]: I0301 10:36:05.597552 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29539350-vrwj8"] Mar 01 
10:36:06 crc kubenswrapper[4792]: I0301 10:36:06.580026 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:06 crc kubenswrapper[4792]: I0301 10:36:06.640327 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:07 crc kubenswrapper[4792]: I0301 10:36:07.419184 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f461ce0a-d106-4086-a698-987b95f5f03e" path="/var/lib/kubelet/pods/f461ce0a-d106-4086-a698-987b95f5f03e/volumes" Mar 01 10:36:08 crc kubenswrapper[4792]: I0301 10:36:08.539076 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q26sh" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerName="registry-server" containerID="cri-o://dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" gracePeriod=2 Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.076347 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.205777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.205896 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.206005 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") pod \"c2a39635-b289-4498-8fe7-89dd096cd6b7\" (UID: \"c2a39635-b289-4498-8fe7-89dd096cd6b7\") " Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.206640 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities" (OuterVolumeSpecName: "utilities") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.211083 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m" (OuterVolumeSpecName: "kube-api-access-m7r8m") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "kube-api-access-m7r8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.231877 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2a39635-b289-4498-8fe7-89dd096cd6b7" (UID: "c2a39635-b289-4498-8fe7-89dd096cd6b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.307891 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.308124 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7r8m\" (UniqueName: \"kubernetes.io/projected/c2a39635-b289-4498-8fe7-89dd096cd6b7-kube-api-access-m7r8m\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.308219 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a39635-b289-4498-8fe7-89dd096cd6b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551002 4792 generic.go:334] "Generic (PLEG): container finished" podID="c2a39635-b289-4498-8fe7-89dd096cd6b7" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" exitCode=0 Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-q26sh" event={"ID":"c2a39635-b289-4498-8fe7-89dd096cd6b7","Type":"ContainerDied","Data":"5edeba1a0067c6ac8c2c982f4caebc08f5998e30ac052774d0d5008a752a8a8f"} Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551105 4792 scope.go:117] "RemoveContainer" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.551057 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q26sh" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.582977 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.587379 4792 scope.go:117] "RemoveContainer" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.597884 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q26sh"] Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.608817 4792 scope.go:117] "RemoveContainer" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.654764 4792 scope.go:117] "RemoveContainer" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 10:36:09.655211 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": container with ID starting with dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3 not found: ID does not exist" containerID="dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.655242 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3"} err="failed to get container status \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": rpc error: code = NotFound desc = could not find container \"dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3\": container with ID starting with dfb56846a9c9011248501381ec8f8dc238f1066b0d1c5de121342cb87f451be3 not found: ID does not exist" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.655264 4792 scope.go:117] "RemoveContainer" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 10:36:09.655983 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": container with ID starting with 865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48 not found: ID does not exist" containerID="865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656004 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48"} err="failed to get container status \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": rpc error: code = NotFound desc = could not find container \"865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48\": container with ID starting with 865dacb19b31d57f85dd3c75502964ee8529dd4b64b612af715d2c4d0c996a48 not found: ID does not exist" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656018 4792 scope.go:117] "RemoveContainer" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: E0301 
10:36:09.656400 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": container with ID starting with 614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659 not found: ID does not exist" containerID="614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659" Mar 01 10:36:09 crc kubenswrapper[4792]: I0301 10:36:09.656585 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659"} err="failed to get container status \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": rpc error: code = NotFound desc = could not find container \"614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659\": container with ID starting with 614c20a304f51f6b29c6201e2d4a764ee9d3e3b0dedf3183d0763288ac384659 not found: ID does not exist" Mar 01 10:36:11 crc kubenswrapper[4792]: I0301 10:36:11.427815 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a39635-b289-4498-8fe7-89dd096cd6b7" path="/var/lib/kubelet/pods/c2a39635-b289-4498-8fe7-89dd096cd6b7/volumes" Mar 01 10:36:34 crc kubenswrapper[4792]: I0301 10:36:34.951526 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:36:34 crc kubenswrapper[4792]: I0301 10:36:34.952148 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 01 10:36:41 crc kubenswrapper[4792]: I0301 10:36:41.029702 4792 scope.go:117] "RemoveContainer" containerID="72dbc4f60f34ad4150df8188c6c0da347f12c8f2e84aa4c765ae93732fb406a9" Mar 01 10:37:04 crc kubenswrapper[4792]: I0301 10:37:04.942956 4792 patch_prober.go:28] interesting pod/machine-config-daemon-bqszv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 01 10:37:04 crc kubenswrapper[4792]: I0301 10:37:04.943793 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bqszv" podUID="9105f6b0-6f16-47aa-8009-73736a90b765" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"